How Predatory Journal Citations Affect Legitimate Medical Publications and the Phenomenon of Citation Contamination

As predatory publishing has become increasingly common throughout the medical publication landscape, knowledge about these practices has increased in turn. Though a majority of researchers are now aware of the threat that predatory publishers pose, this education typically focuses on how researchers can avoid publishing their own research in predatory journals. Another threat from predatory publishing receives far less attention: the complicated ethics and acceptability of citing a paper published in a predatory journal. Here, we’ll evaluate the two primary perspectives on this issue and explore the related phenomenon of citation contamination.

Defining the Problem

One of the primary hallmarks (and dangers) of predatory journals is their insufficient peer review process. To reduce costs and advertise short manuscript decision times, predatory journals will often send manuscripts to unqualified or biased peer reviewers, or they may not send them for peer review at all. As a result, manuscripts published in predatory journals do not meet the peer review standards that are fundamentally expected of published medical literature.

Research that’s published in a predatory journal isn’t necessarily flawed; many of these articles would have been accepted by legitimate medical journals. However, the fact remains that the paper hasn’t undergone the extent of peer review that’s deemed sufficient by the medical community. This creates a dilemma: is it acceptable to cite these sources?

Perspectives on the Issue

Some academicians argue that individual papers with scientific merit should still be citable. Common discussion points in favor of permitting citations of articles in predatory journals include:

  • Individual basis for evaluation. Across the board, experts agree that authors must be much more careful about citing research from predatory journals than research from legitimate ones. Informal forum conversations between academicians typically support citation after a thorough evaluation of the individual paper’s methods, results, and potential errors.
  • Leeway for publications in the ‘gray zone’ of predatory journals. This increasingly common term describes journals that display some predatory qualities but are not wholly dismissed by the scientific community. Discussing citations for papers in these journals creates a whole new layer of confusion, as the overall community doesn’t yet agree whether these journals should be avoided at all. In these cases, individual evaluation remains central to the discussion.

In contrast, some academicians take a hard-and-fast approach barring citation to any predatory journals. Supporting arguments for this perspective include:

  • Refusal to support predatory venues. Citations are one of the primary mechanisms by which a journal sustains itself and continues functioning. By refusing on principle to cite research published in predatory journals, authors can avoid supporting the mechanisms that allow these journals to exist at all.
  • Lack of peer review is fundamentally unacceptable. Regardless of how an author perceives the quality of an individual paper, the fact that it did not undergo peer review from multiple qualified experts means that the paper has not met the community standard for evaluation to determine its quality. Thus, articles in predatory journals must be avoided, regardless of a paper’s individual merit.

Understanding Citation Contamination

Discussions about citing articles published in a predatory journal cannot be separated from the overarching concept of citation contamination. This phenomenon, also called citation pollution, describes the degree to which papers from predatory journals have been cited in legitimate scientific literature, thereby contaminating overall medical knowledge. This can also include citation networks, or informal agreements between authors or editors to cite one another’s papers regardless of whether the citation is scientifically necessary or justifiable, and excessive self-citation practices. There are ongoing debates regarding the extent and severity of the overall medical literature’s contamination.

Across 250 papers published in predatory journals, Björk et al. (2020) found an average of only 2.6 citations per article (compared to an average of 18 citations for peer-reviewed publications in legitimate journals). Similarly, in a small, independent research project, Anderson (2019) identified seven journals with verifiably predatory practices and found that, on the whole, these journals had relatively few citations.

However, Rice et al. (2021) explicitly disagree with the conclusion that predatory journal articles have minimal effects on the overall literature. The authors state that predatory articles distract readers from legitimate research through location bias and that predatory research can pose a significant danger to patients by influencing physicians’ decision-making, especially when articles from predatory publishers are included in systematic reviews.

Key Takeaway: Best Practices for Citations

For the time being, the acceptability of citing a study published in a predatory journal is unclear. As such, it’s generally better to avoid the problem entirely by not using or citing research from predatory journals in your paper. Ultimately, industry-wide shifts are needed to counteract the problem, and citation evaluation strategies, such as the proposed HYDRA citation review workflow, should be explored as a standardized practice during manuscript evaluation.

PRW 2023: “Peer Review and the Future of Publishing”

Each year, our team at Cabells celebrates Peer Review Week (PRW) and recognizes the fact that so much of the work we do each day revolves around peer review, which is the backbone of scholarly communication and the key to maintaining research quality. The theme this year for PRW is “Peer Review and the Future of Publishing,” which would be an appropriate theme every year. To work as intended and as needed, peer review will need to continuously adapt and evolve along with publishing.

The importance of peer review to the quality and overall success of a journal can’t be overstated. For a journal to be recognized in the academic or medical community as legitimate, a robust peer review system must be in place. In recent years, the scholarly community has been shown time and again the results of substandard (or nonexistent) peer review. It has also become clear that identifying an effective and efficient model of peer review is a challenge for publishers.

Our friend, Daley White, a research scientific editor with the Moffitt Cancer Center, has written an excellent piece discussing the current state of peer review and highlighting a few promising alternative strategies. That piece, along with another by Daley discussing the role of generative artificial intelligence in peer review, should not be missed.

The bedrock of scholarly publishing

At its core, peer review is about benefiting the knowledge base by establishing quality control over published research, which is then used to generate more knowledge. By publishing research papers that have been thoughtfully peer reviewed, academic journals make it possible for researchers around the world to learn about the latest findings in their field. This helps advance knowledge and foster collaboration among researchers. Researchers, funders, and the public all expect that research has been reviewed, is sound, and is worthy of being built upon.

Peer review helps to ensure that published work is high quality, with findings that are accurate and reliable, by helping to identify and correct errors, omissions, and biases. Ultimately, authors are responsible for conducting sound research and not fabricating data or results. Unfortunately, the immense pressure to publish, along with the industry’s unwillingness to publish null results, makes this responsibility an insurmountable challenge for some.

For a journal to be considered for inclusion in Journalytics, our evaluators must have evidence of a rigorous peer review system.

To be effective, peer review must be unbiased and transparent, though the extent to which journals are open about their review process varies. Promoting and expanding transparency and accountability in the research and peer review processes shows readers how the paper was evaluated and helps them understand the reasons for its acceptance or rejection. That, in turn, builds trust in the publication process and in the research itself.

Time after time

Can it be assumed that peer review is consistently conducted with the necessary rigor when in most cases it is added to the workload of already very busy and time-strapped reviewers? Most workplaces don’t provide an allowance of time for peer review, and there is no compensation for conducting reviews. So, without incentives, peer review is conducted solely to contribute to a knowledge base that needs to be carefully managed and safeguarded.

Along with pressure on scholars to find the time to conduct reviews, there is pressure on journals to review papers quickly. But can speed be reconciled with quality? Speedy peer review, when taken to an extreme, is an indication of the type of substandard or virtually nonexistent peer review often found in predatory journals.

While it’s important to authors that articles are published in a timely manner (which requires timely peer review), there is a tension between speed and quality that the industry as a whole is working under. Often, the state of a journal’s peer review process comes down to which journals have more resources available. Not all journals can afford to keep an in-house statistician on staff to review research statistics. Training in peer review as part of PhD programs would also be valuable – while early career researchers are very knowledgeable in their fields despite being relatively inexperienced, having ECRs conduct peer review with no training is less than optimal.

So, this PRW we will consider these and other ideas as we continue our work as champions of peer review – and Cabells team member Clarice continues her work as a member of the PRW Steering Committee. Our work at Cabells will adapt and evolve right along with peer review and publishing into the future. What won’t change is the key role played by peer review in maintaining the quality, transparency, and accountability of research and the integrity of knowledge.

Introducing the All-New Journalytics Academic & Predatory Reports

We have some exciting news to share – a new and improved Journalytics Academic & Predatory Reports platform will soon be here. Our team has been working on multiple updates and enhancements to our tried and true platform that will benefit users in multiple ways. Along with our ongoing addition of new verified and predatory journals, users will experience better search results, new data points and visualizations, increased stability and speed, and more secure logins.

In addition to the visual elements and expanded analytics of this redesign, a key component is the full integration of our Journalytics and Predatory Reports databases. This integration will allow for comprehensive searches that present the full range of publishing opportunities and threats in a given area. Our goal is to facilitate journal discovery and evaluation so our users know the journals and know the risks.

Last month we hosted a webinar to give users a sneak peek at the upcoming changes, which include a new guided search page to jumpstart journal discovery, updated platform and journal card designs, and new data points such as fees and article output. Check out the video below or visit our YouTube channel where you’ll find a time-stamped table of contents in the description for easy navigation to specific points in the video.

A preview of the all-new Journalytics Academic & Predatory Reports.

A fresh look with more data

Guided search

The path to journal discovery now begins on our guided search page, where users can

  • search for a journal by title, ISSN, or keyword
  • access our database of legitimate and predatory journals
  • seamlessly sort verified journals with one of our featured metrics shortcuts
  • jump directly to one of 18 academic disciplines

Our guided search page with shortcuts to journal platforms, featured metric sorting, and disciplines.

New fully-integrated platform

Our redesigned platform now features full integration of verified and predatory journals into the same environment. First rolled out as part of our Journalytics Medicine & Predatory Reports platform, the integration has proven to be an invaluable enhancement. Users can feel confident with fully comprehensive search results that capture both legitimate and deceptive publishing opportunities, and they’ll also have the ability to filter out one or the other with just one click.

NO MORE GUESSWORK: Search results now include both legitimate and predatory journals

Search for publications by title, ISSN, discipline, or other keywords and know that we’ve left nothing to chance – verified and predatory journals each have their own card design and data display, making it clear whether the journal is listed in Journalytics or Predatory Reports.

Redesigned journal cards with new data points

Judging a journal’s quality, the likelihood of manuscript acceptance, publication timelines, and potential impact can be difficult. To assist with clear and confident publication evaluations, we have added a few new data points to verified records to facilitate decision-making:

  • Open Access details – copyright, archiving, and access information
  • Fees – who pays for publishing articles, and how much?
  • Article output – how many articles does the journal publish per year?
  • CCI visualization – we have reimagined the visualization of our CCI citation-backed metric, which shows the historical “influence,” or citation activity, for each discipline in which a journal publishes.

The new Journalytics Academic will include the beta version of a new metric: the SDG Impact Intensity™ (SDGII), developed in partnership with the Saint Joseph’s University Haub School of Business.

The SDGII seeks to contextualize and understand the relevance of academic research in terms of the United Nations’ Sustainable Development Goals. Climate change, sustainability, and equity are among the most powerful forces for change in society, and yet they are ignored by traditional citation-based metrics. We hope to help lead the charge to change this dangerous oversight.

This is a pilot program currently limited to a small number of business journals, but it will soon be expanded to increase awareness of sustainability-minded journals publishing impactful research.

For more information, see our videos covering the SDGII on our YouTube channel.

New look, same deceptive operations

In recent years, awareness of the nature and scope of the problem presented by predatory publishers has increased within the scholarly community. At the same time, predatory publishers themselves have become more aware of what they need to do to appear legitimate and avoid detection. Their tactics are evolving right along with efforts to combat them, and their numbers are growing – we currently have reports on more than 17,000 predatory publications. Each of these reports includes:

  • Journal identification – each report provides the title, publisher, discipline(s), ISSN (if available), and website links for journal discovery and confirmation.
  • Violation categories – journal reports monitor the areas in which the deceptive behaviors occurred.
  • Violation severity – reports also track the severity of the deceptive behaviors.

By providing not just identifying information for predatory journals, but also a report on the nature, scope, and severity of their behaviors, we aim to equip our users with an understanding of the varied tactics predatory publishers employ. Our goal is to educate and inform researchers on the different profiles and archetypes of predatory journals we uncover, so they are better able to identify and avoid them as their careers continue.

What’s next?

Be on the lookout for further updates on the timing of the release of the updated platform, as well as information on the upcoming new website that will serve as the central hub for all of our resources, complete with a portal for platform access, links to product information and criteria, and our blog.

We will also host another webinar as we move closer to the launch date for a final look at the upcoming enhancements. Stay tuned!

Open Access: History, 20-Year Trends, and Projected Future for Scholarly Publishing

It’s hard to imagine where the scholarly publishing landscape would be today without open access. As we reach two decades from the inception of open access, it’s important to evaluate how this model has revolutionized research and its potential future directions.

A Brief History of Open Access

1991: The beginning of the open access movement is commonly attributed to the formation of arXiv.org (pronounced ‘archive’), the first widely available repository for authors to self-archive their own research articles for preservation. ArXiv.org is still widely used for article deposition, hosting over 2 million articles as of January 2023.

1994: Dr. Stevan Harnad’s ‘A Subversive Proposal’ recommended that authors publish their articles in a centralized repository for free, immediate public access, leveraging the potential of the up-and-coming internet and combating the rapidly increasing publication costs and slow speed of print publishing (i.e., the ‘serials crisis’). Though this was not the first traceable mention of what would become open access publication, it’s widely considered the start of an international dialogue between scientific researchers, software engineers, journal publication specialists, and other interested stakeholders.

2000-2010: Open access journals began appearing within the publishing landscape. Throughout the decade, a heated back-and-forth debate persisted between open access proponents and traditional non-OA publishers (https://www.bmj.com/content/334/7587/227).

2001: The Budapest Open Access Initiative (BOAI) resulted in a declaration establishing the need for unrestricted, free-to-reader access to scholarly literature. This initiative is widely credited with coining the phrase ‘open access.’

2003: As a follow-up to BOAI, the 2003 Berlin Declaration on Open Access to Knowledge in the Sciences and Humanities expanded upon the definitions and legal structure of open access and was supported by many large international research institutes and universities.

2013-present: Multiple governments have announced mandates supporting or requiring open access publishing, including the United States, the United Kingdom, India, Canada, Spain, China, Mexico, and more.

2018: cOAlition S was formed by several major funders and governmental bodies to support full and immediate open access of scholarly literature through Plan S.

Current State of Open Access

Today, there are four primary submodels of scholarly open access article publishing:

  • Gold: all articles are published through open access, and the journal is indexed by the Directory of Open Access Journals (DOAJ). The author typically pays an article processing charge.
  • Green: manuscripts require reader payment on the publisher’s website but can be self-archived in a disciplinary open access archive, such as ArXiv, or an institutional open access archive. A time-based embargo period may be required before the article can be archived. The authors are not required to pay an article processing charge.
  • Hybrid: authors have the choice to publish their work through the gold or green open access models.
  • Bronze: a newer and less common option than gold, green, or hybrid open access, bronze open access means that articles are made free to read on the publisher’s website, typically within a subscription-based journal, but without a clear open license.

Though open access isn’t yet the default for publishing, it’s widely available and quickly becoming an expected offering from journals. Additionally, research funding bodies, such as the Wellcome Trust and the National Institutes of Health, are increasingly requiring open access publication as a condition of funding.

Since its launch in 2003, the Directory of Open Access Journals (DOAJ) has risen to the forefront as one of the most comprehensive community-curated lists of reputable open access journals. Unfortunately, however, the rise of open access has also enabled a widespread increase in predatory publishing practices, and counteracting predatory publishers is expected to be a primary focus of future open access development.

Open Access Growth Trends

Before diving into the numbers, it’s important to note that open access reporting is unstandardized. Depending on the databases assessed and definitions of open access, document types, and related terms, the reported number of open access articles per year can differ dramatically between reports. However, overarching trends remain relatively consistent across reports.

In 2018, the European Commission’s Research and Innovation directorate, an official research arm of the European Union, found that 30.9% of publications were open access in 2009, a share that increased to 41.2% in 2016 and then tapered off slightly to 36.2% in 2018. As of 2019, 31% of funders required open access publishing of research, 35% encouraged open access publishing, and 33% had no overt policy or stance.

In 2022, the Research Information Observatory partnered with the Max Planck Digital Library and Big Data Analytics Group to compile and publish their data paper, “Long Term Global Trends in Open Access.” Their report found that the percentage of articles that are accessible without paywall subscriptions has increased substantially: around 30% of articles published in 2010 were openly accessible, which jumped to around 50% of articles published in 2019.

Future Expectations and Projections for Open Access

The Predator Effect: Understanding the Past, Present and Future of Deceptive Academic Journals

During his time working at Cabells, predatory publishing practices turned into a near obsession for Simon Linacre – so much so that he wrote a book about it: The Predator Effect. Here he shares details of the book, and how predatory journals could form part of a publishing ethics crisis.


In a recent conversation with a senior academic regarding publishing ethics, the discussion veered between predatory publishing, paper mills, paraphrasing software, and the question of whether an article written by AI could be regarded as an original piece of work. Shaking his head, the academic sighed and exclaimed: “Retirement is starting to look pretty good right now!” The conversation demonstrated what a lot of people in scholarly communications feel at this moment in time: that we are losing the arms race when it comes to research integrity and publishing ethics.

In the last year, we have seen the number of predatory journals included in Cabells’ Predatory Reports database approach 17,000, thousands of articles retracted by major publishers such as Wiley and IOP, and major scandals, such as one I worked on with Digital Science company Ripeta, in which one author was responsible for dozens of plagiarised articles. The concern is that many more articles might have leaked into the scholarly communications system from paper mills, and this is coupled with leaps in technology that enable students and authors to buy essays and articles generated by AI without lifting a finger themselves. No wonder older scholars who didn’t have to deal with such technologies are shaking their heads in despair.

Negative Impact

These issues can seem rather abstract, as they don’t necessarily translate into tangible impacts for most people, but this also means they can be misunderstood and underestimated. For example, what happens when an individual reads about a cure in a predatory journal and tries to use it, making a patient’s condition worse? Or what about someone qualifying for a position based on coursework they cheated on? There are numerous instances where a breakdown in ethics and integrity can cause major problems.

More broadly, the whole fabric of trust that society has in academic research risks being undermined with so many options open to bad actors if they wish to buck the system for their own ends. We have seen this with the fateful Wakefield article about the MMR vaccine in the 1990s, the effects of which are still being felt today. That was an anomaly, but if people ceased to believe that published research was trustworthy because of these numerous threats, then we will indeed be in a perilous position.

Digital Solutions

The scale of these problems can be seen in three recent publications, which I discussed in a recent talk at the ConTech 2022 Conference in London:

  • In September, The State of Trust & Integrity in Research (STIR) report was published by Ripeta, which outlined some of the issues facing research integrity and how greater standardisation and investment in technology are required
  • In October, the State of Open Data (SoOD) report was published by Figshare, Digital Science and Springer Nature. It presented the results of a huge survey of researchers, which showed that open data sharing was only growing gradually and that policymaking needed to be more joined up and consistent
  • In November, The Predator Effect was published – a short open access ebook detailing the history and impact of predatory publishing practices. 

While each of these publications offers some sobering findings in terms of the problems faced by scholarly communications, they also offer some hope that technology might provide solutions in the future. In terms of predatory journals, this means not only using technology as one solution, but using multiple solutions together in a joined-up way. As I say in the book:

“Using technology to improve hygiene factors such as legitimate references may be another strategy that, if adopted together and more widely, will have a significant impact on predatory journal output.” (Linacre, 2022)

Concerns around trust in science are real, but so is the anticipation that technology can show how scholarly communications can move forward. As a former publisher, I thought technology could easily solve the problem, but working at Cabells made me understand that much more work is required in equipping researchers with the right tools, knowledge and know-how to avoid predatory journals. In the past, collaboration in the industry has often been slow and not fully inclusive, but this will have to change if a breakdown in research integrity and publication ethics is to be avoided.

Update: A Journal Hijacking

Editor’s Note: This is an updated version of an article originally posted in August, 2021.


As members of our journal evaluation team work their way around the universe of academic and medical publications, one of the more brazen and egregious predatory publishing scams they encounter is the hijacked, or cloned, journal.  One recent case of this scheme uncovered by our team, while frustrating in its flagrance, also offered some levity by way of its ineptitude. But make no mistake, hijacked journals are one of the more nefarious and injurious operations carried out by predatory publishers. They cause extensive damage not just to the legitimate journal that has had its name and brand stolen, but to medical and academic research at large, their respective communities of researchers and funders, and, ultimately, society.

There are a few different variations on the hijacked journal, but all involve a counterfeit operation stealing the title, branding, ISSN, and/or domain name of a legitimate journal to create a duplicate, fraudulent version of the same. They do this to lure unsuspecting (or not) researchers into submitting their manuscripts (on any topic, not just those covered by the original, legitimate publication) with promises of rapid publication for a fee.

A recent case of journal hijacking investigated by our team involved the legitimate journal, Tierärztliche Praxis, a veterinary journal out of Germany with two series, one for small and one for large animal practitioners:

The legitimate website for Tierärztliche Praxis

The legitimate journal has had its name and branding stolen by this counterfeit operation, which uses the same title:

The website for the hijacked version of Tierärztliche Praxis

One of the more immediate problems caused by cloned journals is how difficult they make it for scholars to discover and engage with the legitimate journal, as shown in the image below of Google search results for “Tierärztliche Praxis.” The first several search results refer to the fake journal, including the top result which links to the fake journal homepage:

“Tierärztliche praxis” translates to “veterinary practice” in English, and the legitimate journal is of course aimed at veterinary practitioners. Not so for the fake Tierärztliche Praxis “journal” (whose “publishers” didn’t bother/don’t care to find out what “tierärztliche” translates to) which claims to be a multidisciplinary journal covering all subjects and will accept articles on anything by anyone willing to pay to be published:

Aside from the more obvious signs of deception found with the cloned journal – a poor website with duplicate text and poor grammar, an overly simple submission process, and no consideration of the range of topics covered, to name a few – this journal’s “archive” of (stolen) articles takes things to a new level:

Above: the original article, stolen from Tuexenia vs. the hijacked version

A few things to note:

  • The stolen article shown in the pictures above is not even from the original journal that is being hijacked, but from a completely different journal, Tuexenia.
  • The white rectangle near the top left of the page covering the original journal’s title, the poorly superimposed hijacked journal title and ISSN in the header of the pages, and the new volume information and page number in the footer (without bothering to redact the original article’s page numbers).
  • The FINGER at the bottom left of just about every other page of this stolen article.

Predatory Reports listing for the hijacked version of Tierärztliche Praxis

Sadly, not all hijacked or otherwise predatory journals are this easy to spot. Medical and academic researchers must be hyper-vigilant when it comes to selecting a publication to which they submit their work. Refer to Cabells Predatory Reports criteria to become familiar with the tactics used by predatory publishers. Look at journal websites with a critical eye and be mindful of some of the more obvious red flags such as promises of fast publication, no information on the peer review process, dead links or poor grammar on the website, or pictures (with or without fingers) of obviously altered articles in the journal archives.

Can Research Lost to Predatory Journals Be Saved?

At the launch event last month for the InterAcademy Partnership’s (IAP) recently released report on combatting predatory academic journals and conferences, an all too familiar question was posted to the virtual session’s chat by an attendee:

… I made this mistake once and I published a paper in one of these journals … now it does not appear online on searching … how can I withdraw this paper and republish it in a trusted journal??

This is a variation of a question we at Cabells are asked and consider frequently, and one that perfectly encapsulates the scholarly publishing-esque three-act drama that unfolds when a researcher is entangled with a deceptive publishing operation:

Act I: Setup

‘… I made this mistake once and I published a paper in one of these journals …’

The ‘mistake’ made (or in our drama, the ‘inciting incident’) was unknowingly submitting work to and publishing it in a predatory journal. This can and does happen innocently and somewhat easily to unsuspecting researchers, most often students and early career researchers.

Act II: Confrontation

‘…now it does not appear online on searching…’

The stakes are raised as the ramifications of the inciting incident from Act I are realized. One of the damaging results of having research published in a predatory journal is that it won’t be easily (if at all) discoverable. Some predatory journals advertise that they are included in well-known databases like Web of Science, Scopus, or Cabells, when they are not. These operations devote no time or resources to developing SEO or facilitating inclusion in research databases, so published articles will be difficult, if not impossible, to find.

Act III: Resolution

‘…How can I withdraw this paper and republish it in a trusted journal??’

The short answer, as provided in the launch event chat by Susan Veldsman, one of the authors of the IAP report, was succinct and unfortunately accurate:

‘Authors have reported that it is very difficult to retract these articles, actually no chance, as publishers just ignore requests and pleas from authors.’

This is the sad truth. Once an article is submitted to a predatory journal, there is little to no hope of successfully withdrawing it. These requests by authors are typically ignored or left unaddressed. Once the article is published by the predatory journal, which often occurs without notice, researchers risk running afoul of publication ethics concerning duplicate publication if they submit it to a second publication, whether or not copyright has been transferred. But should this be the case?

One alternative for dealing with research that has essentially been ‘lost’ to predatory operations, and so dismissed or ignored, was put forth by Jeanette Hatherill, Scholarly Communication Librarian at the University of Ottawa. Hatherill proposes that an author be able to “… retract or withdraw the article, acknowledge its ‘prior publication’ and submit it to a preprint server to make it available for open peer review.” While most preprint servers, including bioRxiv, require that articles be submitted prior to being accepted by (and of course, published in) a journal, Hatherill points out that these policies are set by the preprint servers and can be examined and revised.

As for the question of copyright, Hatherill notes that ‘even deceptive publishers’ such as OMICS, the predatory publishing giant recently on the losing end of a $50 million judgment due to its predatory publishing practices, ‘state that all articles are available under a Creative Commons Attribution license.’ Publishing an article open access under a Creative Commons license leaves the copyright with the author, meaning that from a copyright standpoint it should be permissible to post the article on a preprint server as long as the place of first ‘publication’ is cited.

This solution doesn’t address the harmful effects of duplicate publication, like skewed citation metrics or flawed research due to redundant results from multiple publications, but the risk is minimal, as papers published in predatory journals attract little attention and few citations from scientists, especially when compared to those published in reputable publications.

Until a more comprehensive, structured, and widely applicable solution is found to the dilemma of how to salvage legitimate and potentially valuable research that has been unknowingly published in a predatory journal, creative solutions such as posting to a preprint server with an acknowledgment of prior publication might be the most effective and efficient way to proceed.

IAP Report Sets Out Plan of Action for Fighting Predatory Academic Practices

Stemming the tide of predatory publishing operations is a challenging endeavor. Cabells has witnessed this firsthand through the rapid growth of our Predatory Reports database, which now lists over 16,000 deceptive publications. Advancements in digital publishing have made it easier than ever to launch and operate academic journals and have done much to democratize and globalize research. However, these same advancements have also made it easier than ever to create fake publishing operations that are focused solely on profit, with no regard for scholarship.

Recently, we discussed the importance of ‘researching your research’ and how one researcher’s persistence in vetting a suspect speaking opportunity at a conference traced back to a predatory publisher, Knowledge Enterprises Inc. (KEI), which happened to have six journals included in Predatory Reports. Predatory publishing outfits such as KEI were the focus of the recently released report from the InterAcademy Partnership (IAP), the global network of over 140 science, engineering, and medical academies. The report, “Combating Predatory Academic Journals and Conferences,” was the result of a two-year study to determine what constitutes predatory practices, pinpoint their root causes and drivers, and provide recommendations and guidance on how they can be identified and avoided.

We previously covered the initial findings from the survey of over 1,800 academics in 112 countries, which found that:

  • nearly a quarter of the academics had either published in a predatory journal, participated in a predatory conference, or didn’t know if they had
  • over 80% thought predatory practices were on the rise or a serious problem in their country of work
  • over 80% thought these practices fueled misinformation in public policy.

The study shows that researchers in all countries, at all stages of their career, and in any discipline can be vulnerable to predatory practices, and as a result, raising awareness is now a vital mission for IAP.

The authors identified three main drivers of predatory practices: the increasing monetization and commercialization of the scholarly enterprise, the predominance of quantity-over-quality research evaluation systems, and serious challenges and weaknesses in the peer-review system. To make a lasting and measurable impact on the pervasiveness of predatory journal and conference practices, these root causes, and the unintended consequences that spring from them, require urgent action.

The final section of the report examines the conclusions of the study, including the need for an evolved definition of predatory academic journals and conferences and an increase in the awareness and understanding of predatory behaviors. The study also concludes that predatory operations are on the rise and undermine public trust in research, waste resources, and exploit weaknesses in the peer review system.

Most importantly, the authors set out recommendations for a course of action to combat these harmful and pervasive outfits. Cabells takes seriously the fact that our resources, in particular Predatory Reports, are recommended as trustworthy and effective tools to identify and avoid predatory operations.

Ultimately, the report stresses the need for urgent and collective action among all stakeholders as predatory practices continue to rise at an alarming rate. Training is imperative as is the need for cooperation from all players in taking action on the report’s recommendations. The authors assert that efforts to identify, understand, and expose predatory academic operations must continue, and the root causes of predatory practices need to be addressed if interventions are to have any lasting impact.

BOOK REVIEW: Predatory Publishing, by Jingfeng Xia (Routledge)

During 2021, while Simon Linacre was researching and writing what he thought was the first book on predatory journals, he discovered… someone had got there first. Putting rivalry to one side, he reviews the recently published book, which offers in-depth research into a phenomenon that is now stepping out of the shadows.


It is a curious feeling reading a book on a topic that you yourself have written about. During 2021, when I was writing a short ebook on predatory journals (to be published later this year), I heard that Jingfeng Xia – a former academic based in the US – had written a book on predatory publishing that was due out at the end of the year. It was, therefore, with a mix of trepidation and intrigue that I ordered the book as soon as it was released to see what another author had made of the phenomenon. And I wasn’t disappointed.

Predatory Publishing (Xia, 2021) presents an overview of not just predatory publishing practices, but also predatory conferences, journal hijackings and other related deceptive activities. The stated aim of the book is to provide a reference point for researchers, authors and other stakeholders in scholarly communications, and its comprehensive academic research builds a solid base to achieve this. After introducing the topic and giving some necessary background, the meat of the book goes into some detail on predatory journals and predatory publishers, and the market dynamics that have enabled them to develop and prosper.

As you would expect, a good deal of the book focuses on Jeffrey Beall and Beall’s Lists, which are explained and discussed objectively, as are some examples of predatory journal behaviors. Xia also discusses Cabells’ Predatory Reports and other “blacklists”, and the use of this term to describe lists of predatory journals does sit rather uneasily, as Cabells and many other organizations have moved away from employing it. Nevertheless, the author looks at this and other lists of recommended journals and does a good job of highlighting how they work and the value they can offer researchers if used wisely. Of particular use is the inclusion of numerous screenshots and tables of information to fulfil the intention of providing a useful reference for authors, including Cabells’ list of criteria for including titles in its Predatory Reports database.

In terms of publishers, Xia has decided to use several examples of predatory and non-predatory behaviour based on publishers that were included in Beall’s List. This is particularly instructive as it highlights both accepted predatory publishers and why they were included in Beall’s List (in this case OMICS), as well as publishers that were included at one stage but then removed because they were able to show their activities were legitimate (in this case MDPI). By highlighting real examples of publishing behaviours – both deceptive and legitimate – those people hitherto ignorant of predatory publishing practices will be much enlightened.

The rest of the book includes an excellent short chapter on the role journal stakeholders play in predatory publishing, including editors and reviewers who have worked (or have been purported to work) on predatory journals, although of course one of the main traits of such journals is that they don’t have any such stakeholders on board. But as Xia notes, “it takes a village to build the predatory publishing market”, and stakeholders other than predatory publishers themselves have been complicit in growing the phenomenon, such as those authors who knowingly publish in the journals to satisfy some requirement or other. Further chapters on predatory conferences, hijacked journals and, in particular, fake indices are also instructive, and Xia’s dissection of the latter is particularly welcome. The explanation and presentation of a long list of such indices is perhaps unique in the literature on predatory publishing, and extremely valuable to researchers taken in by data points made to look like Clarivate Analytics’ Journal Impact Factor.

One unfortunate consequence of reading a book on a topic you are so familiar with is that it is all too easy to spot errors. One such error relates to a common myth that Cabells’ Predatory Reports database and Beall’s Lists are in some way linked – they are not. Xia quotes one academic article saying “they [Cabells] do take many articles from Beall’s archive”, and says elsewhere that “unlike Beall’s journal blacklist, which has been taken over by Cabells…”. Both these statements are untrue – Cabells developed its database independently, and while it spoke to Beall as an expert in the area during development, it verified each journal against its own criteria. If there is one criticism of what is an otherwise excellent book, it is that it is rather a cold and dispassionate investigation into the subject that relies a little too much on academic research at the expense of a little journalistic endeavour. Conducting interviews and speaking to stakeholders might have brought the topic more alive and achieved the author’s aim of providing a much-needed point of clarity on what has always been an all-too-murky subject area.

Xia, J. (2022). Predatory Publishing. Routledge. https://www.routledge.com/Predatory-Publishing/Xia/p/book/9780367465322

Academic Sleuthing

With plenty of advice and guidance on the internet on how to identify and avoid predatory journals, many argue the game is up. However, Simon Linacre argues that, with so many authors and journals still slipping through the net, numerous skills are required to avoid the pitfalls – not the least of which, as one case study shows, is being an amateur sleuth…


Back in the day when I used to lecture researchers on optimizing their publishing strategy, I always used the refrain ‘research your research’ to underline the importance of applying the investigative skills of academic research to understanding scholarly communications. Knowledge is power, as the saying goes, and knowing how the medium of academic publishing works enables effective and robust decision-making, especially in academia where those decisions can have a long-term bearing on careers. Knowing the best journals to publish in can prove to be a huge benefit to any given academic.

Turns out knowing where NOT to publish can also have the same benefits.

This notion was underlined to Cabells this month when an academic publications advisor highlighted a case they had been involved in at their university. The advisor – whose identity, and that of the institution, have been anonymized at their request – was based at a research institute and, among other duties, advised its researchers about submissions to academic journals, including such things as copyediting, publishing licenses, and open access payments.

Recently, one of the institute’s academics had been invited to present at a conference in 2022, but the invitation was brought to the advisor’s attention as it was a little outside their normal sphere of activity. The advisor thought the invite and presentation were a bit unprofessional and advised against accepting the invitation. Upon further investigation, they found the conference was linked to a suspected predatory publisher, which had been highlighted online in several different sources.

However, the advisor was still not satisfied, as while there were suggested links and implications, there was also some evidence of legitimate activities and details. It was only when the advisor scrutinized some of the journals’ articles that they found further evidence of fake journals and scientific anomalies and contacted us to confirm their suspicions. We were glad to confirm that the publisher in question – Knowledge Enterprises Inc. (KEI) – indeed looked suspicious and had six journals included in our Predatory Reports database [see image below for example].

Predatory Reports entry for Journal of Economics and Banking from KEI Journals

The moral of this story is not just that ‘researching your research’ can help identify bad actors. It also shows that persistence with an investigation and a wide range of inputs from different sources are required to support ethical publication practices. In some cases, nothing less will do.