The rise and rise of predatory journals and conferences

Editor’s Note: Today’s post is by Tracey Elliott, Ph.D. Dr. Elliott is the Project Director at the InterAcademy Partnership (IAP), currently overseeing its project Combatting Predatory Academic Journals and Conferences.


Predatory academic journals and, even more so, predatory conferences have been given surprisingly little attention in academic circles, despite their rapid growth and sophistication in recent years.  Juxtaposed with the pervasive “publish or perish” research assessment culture, where quantity trumps quality, the research community risks sleepwalking into a perfect storm.  Predatory academic practices are one manifestation of a surge in online scams and deceit that are deluging many sectors, fuelled further by changes in (post-) pandemic lifestyles, but their impact on the knowledge economy, research enterprise, and public policy is potentially profound. 

The InterAcademy Partnership (IAP) – the global network of over 140 academies of science, engineering and medicine – is leading an international project, “Combatting predatory journals and conferences”, which seeks to better understand the growing menace of these practices: to gauge their extent and impact, identify what drives them, and determine what actions are required to curb them.  With the number of predatory journals now estimated to be at least 14,500 (Cabells) and predatory conferences believed to outnumber legitimate ones (THES), this project is urgent, and our recent survey of researchers all over the world is illuminating.

Conducted in November-December 2020, the survey gives concerning insight into the extent and impact of predatory practices across the world.  Based on the 1800+ respondents, two headlines are particularly striking:

1. Over 80% of respondents perceived predatory practices to be a serious problem or on the rise in their country.
2. At least a quarter of respondents had either published in a predatory journal, participated in a predatory conference, or did not know if they had.  Reasons cited for this included a lack of awareness of such scams and encouragement by their peers. Indeed, there is anecdotal evidence to suggest that the use of predatory journals and conferences is embedded, or at least tolerated, in some institutions/networks.

Contrary to some studies citing that early career researchers are especially vulnerable, we found no correlation between a researcher’s career stage or discipline and their likelihood of publishing in a predatory journal or participating in a predatory conference.  However, there is a small correlation with the economic status of the country in which they work: those in lower- and middle-income countries are more likely to participate or publish than those in high-income countries. If left unchecked, the research gap between higher- and lower-income countries risks widening. Putting definitive guidance on predatory journals behind paywalls, whilst sometimes unavoidable, risks exacerbating this further.

A challenge for such essential services, whether paywalled or not, is how to distinguish fraudulent, deceitful journals from low-quality but well-intentioned and legitimate ones. Whilst bringing the clarity researchers crave, journal safelists and watchlists force a binary in-or-out decision that is increasingly inadequate and unfair.  In reality, there is a spectrum of fast-evolving and highly nuanced publishing practices that makes the work of Cabells and its counterparts very difficult. IAP is currently exploring a subset of Cabells-listed predatory journals, using internet scraping and spidering techniques to gather data on predatory publishing.
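To illustrate the kind of signal such scraping might surface – a purely hypothetical sketch, not the IAP study’s actual methodology, and with an invented red-flag list – a crawler could scan the visible text of a journal’s homepage for phrases commonly cited as warning signs:

```python
from html.parser import HTMLParser

# Illustrative red-flag phrases only; not drawn from the IAP study.
RED_FLAGS = [
    "rapid publication",
    "guaranteed acceptance",
    "indexed in all major databases",
]

class TextExtractor(HTMLParser):
    """Collect the visible text of an HTML page."""
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        self.chunks.append(data)

def red_flag_count(html: str) -> int:
    """Count how many red-flag phrases appear in the page text."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(parser.chunks).lower()
    return sum(phrase in text for phrase in RED_FLAGS)

sample = ("<html><body><h1>Journal of Everything</h1>"
          "<p>Rapid publication within 48 hours. "
          "Guaranteed acceptance for all submissions.</p></body></html>")
print(red_flag_count(sample))  # → 2
```

In practice, of course, a single phrase proves nothing; signals like these only become meaningful in aggregate and alongside human judgement.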

Our project report, anticipated by early 2022, will include recommendations for all key stakeholder communities – researchers, research funders, publishers, academies and universities, libraries, and indexing services. With IAP as a conduit to academies and research communities throughout the world, we will focus on awareness-raising, training, and mentoring resources, and mobilising governments, multilateral and intergovernmental organisations.

Industrial disease

It’s almost four years since Cabells launched its Predatory Reports database, but the battle to overcome predatory journals shows no signs of abating. As a result, Cabells is constantly developing new ways to support authors and their institutions in dealing with the problem, and this week Simon Linacre reports from the virtual SSP Annual Meeting on a new collaboration with Edifix from Inera, which helps identify articles and authors published in predatory journals.

A common retort heard or read on social media whenever there is a discussion on predatory journals goes something like this: “is there really any harm done?”, “some research is only good enough for those kinds of journals,” or “everyone knows those journals are fake.” For the latter two rejoinders there is some justification, and if recent global events have taught us anything it is that we need a sense of proportion when dealing with scientific breakthroughs and analysis. But the first point really doesn’t hold water because, when you think it through, a good deal of harm is done to a number of different stakeholders as a result of one article appearing in a predatory journal.

Predatory journals do researchers and their institutions a huge disservice by claiming to be a reputable outlet for publication. Legitimate journals provide valuable services to both promote and protect authors’ work, which simply doesn’t happen with predatory journals. Essentially, there are three key reasons why authors and their employers can suffer harm from publishing in the wrong journals:

  • Their work may be subject to sub-par peer review, or more likely no peer review at all. The peer review system isn’t perfect, but papers that undergo peer review are better for it. Researchers want to make sure they are publishing in a place that values their work and is willing to devote time and resources to improving it.
  • Versions of record could disappear. One of the advantages of publishing with a reputable journal is that they make commitments to preserve authors’ work. Opportunists looking to make a quick buck are not going to care if your paper is still available in five years – or even five weeks.
  • Published articles will be hard to find. Some predatory journals advertise that they are included in well-known databases like Web of Science, Scopus, or Cabells when they are not. Predatory journals invest nothing in SEO and make no effort to get their titles indexed in research databases, so the research won’t be easily discoverable.

So, it is in the interests of authors, universities, societies, funders and society itself that research is not lost to predatory publishing activities. Checking against a database such as Predatory Reports will help those stakeholders, but to augment their capabilities Cabells is collaborating with Atypon’s Inera division, and specifically its Edifix product, to help prevent ‘citation contamination’. This is where illegitimate articles published in predatory journals find their way into the research bloodstream by being referenced by legitimate journals. With Edifix, users can now vet bibliographic reference lists for citations to predatory journals, as identified by Predatory Reports.
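Conceptually, the screening step is straightforward, even if Edifix’s real pipeline is far more sophisticated. As a minimal sketch – the watchlist entries, reference format, and matching logic below are invented for illustration and are not Cabells’ or Inera’s actual data or API – a reference list can be checked against a set of known predatory journal titles:

```python
# Illustrative watchlist; a real check would query Cabells' Predatory Reports.
WATCHLIST = {
    "international journal of advanced everything",
    "global review of rapid science",
}

def normalize(title: str) -> str:
    """Lowercase and collapse whitespace so title matching is robust."""
    return " ".join(title.lower().split())

def flag_contaminated(references: list[dict]) -> list[dict]:
    """Return the references whose journal title appears on the watchlist."""
    return [ref for ref in references
            if normalize(ref["journal"]) in WATCHLIST]

refs = [
    {"author": "Smith, J.", "journal": "Nature"},
    {"author": "Doe, A.", "journal": "Global Review of  Rapid Science"},
]
print(flag_contaminated(refs))  # flags only the Doe reference
```

Real-world matching is harder than this – predatory titles often mimic legitimate ones with tiny variations – which is precisely why a curated, continuously updated database matters.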

This new Edifix web service with the automated Cabells Reference Checking Tool was showcased at SSP’s Annual Meeting (meeting registration required) this week (and previewed in an SSP sponsored session in October 2020) with a host of other new innovations, collaborations and product developments from the scholarly communications industry. While it would have been great to see old friends and colleagues in person at the event, the virtual format enabled much wider, international engagement which contributed to an undoubtedly successful event.

No laughing matter

The latest meme to sweep Twitter in the last week has been a satirical look at typical journal articles. Simon Linacre introduces Cabells’ own take on the theme and reflects on the impact they can have on our shared conscience.


We all like memes, right? Those social media nuggets that we can all relate to and laugh at, a form of in-joke without having to be with a group of people, which under current circumstances has meant a kind of gold rush for this form of humor. Whether it is the boyfriend looking over his shoulder at another woman or the excerpt from the film Downfall with Hitler going berserk, the number of situations and news items that lend themselves to this form of parody is virtually endless.

So, when the meme spotlight fell on our own corner of the scholarly publishing world, we couldn’t help but join in and adapt the scientific paper meme to predatory journals (see image). To be honest, it wasn’t too difficult to think of 12 journal titles that highlighted the problems predatory publishing causes, and a whole series of memes could easily be created to underscore the joke that is the predatory journal phenomenon.

It’s easy to spot the themes we chose to lampoon. However familiar we become with the predatory journal tropes, though, new publications and journals are emerging all the time – as testified by the total number of journals listed in Cabells’ Predatory Reports hitting 14,500 this week. Among the issues we put under the spotlight in the graphic are the unethical and unaware authors publishing in predatory titles, how easily poor research or plagiarized content can be published, and some of the poor excuses offered by those who end up publishing in dodgy journals.

However, underneath the tomfoolery there is a serious point to be made. A recent op-ed in The Atlantic took the opportunity of highlighting not just the shared joy and geekiness of the scientific paper meme, but also the existential dread it spotlighted. As the article expertly points out, while academics recognize the hamster-in-a-wheel absurdity the meme represents, they cannot help but see themselves in the wheel, unable to stop running. Some will just shrug their shoulders and find the next piece of clickbait; for others, there is little consolation in the humor and plenty of angst to keep in check if they are to preserve their sanity.

When it comes to predatory journals, from a pure eyeballs perspective we can see that articles and social media posts about the often bizarre world of predatory publishing get the most traction, such as the fact that one predatory journal lists Yosemite Sam on the editorial board. And yet there is always a serious point behind these fun stories, which is that predatory journals can make an unholy mess of scientific research, causing millions of funding dollars to be wasted and allowing either junk or rank bad science to contaminate legitimate published research. This is the real punchline, and it sometimes rings pretty hollow.

No more grist to the mill

Numerous recent reports have highlighted the problems caused by published articles that originated from paper mills. Simon Linacre asks what these 21st Century mills do and what other dangers could lurk in the future.


For those of us who remember life before the internet, and have witnessed its all-encompassing influence rise over the years, there is a certain irony in the idea of recommending a Wikipedia page as a trusted source of information on academic research. In the early days of Jimmy Wales’ huge project, whilst it was praised for its utility and breadth, there was always a knowing nod when referring someone there as if to say ‘obviously, don’t believe everything you read on there.’ Stories about fake deaths and hijacked pages cemented its reputation as a useful, but flawed source of information.

However, in recent years those knowing winks seem to have subsided, and in a way, it has become rather boring and reliable. We no longer hear about Dave Grohl dying prematurely and for most of us, such is our level of scepticism we can probably suss out if anything on the site fails to pass the smell test. As a result, it has become perhaps what it always wanted to be – the first port of call for quick information.

That said, one would hesitate to recommend it to one’s children as a sole source of information, and any researcher would think twice before citing it in their work. Hence the irony in recommending the following Wikipedia page as a first step towards understanding the dangers posed by paper mills: https://en.wikipedia.org/wiki/Research_paper_mill. It is the perfect post, describing briefly what paper mills are and citing updated sources from Nature and COPE on the impact they have had on scholarly publishing and how to deal with them.

For the uninitiated, paper mills are third-party organisations set up to create articles that individuals can submit to journals to gain a publication without having to do much – or even any – of the original research. Linked to their cousin, the essay mill, which serves undergraduates, paper mills may have generated thousands of articles that have subsequently been published in legitimate research journals.

What the reports and guidance from Nature and COPE seem to suggest is that while many of the paper mills have sprung up in China and are used by Chinese authors, recent changes in Chinese government policy, moving away from strict publication-counting as a performance measure, could mitigate the problem. In addition, high-profile cases shared by publishers such as Wiley and Sage point to some success in identifying transgressions, leading to multiple retractions (albeit rather slowly). The problem such articles present is clear: they peddle junk or fake science that could lead to numerous problems if taken at face value by other researchers or the general public. What’s more, there is the worrying possibility of paper mills increasing their sophistication to evade detection, ultimately eroding the faith people have always had in peer-reviewed academic research. If Wikipedia can turn around its reputation so effectively, then perhaps it’s not too late for the scholarly publishing industry to act in concert to head off a similar problem.

Rewriting the scholarly* record books

Are predatory journals to academic publishing what PEDs are to Major League Baseball?


The 2021 Major League Baseball season is underway and for fans everywhere, the crack of the bat and pop of the mitt have come not a moment too soon. America’s ‘National Pastime’ is back and for at least a few weeks, players and fans for all 30 teams have reason to be optimistic (even if your team’s slugging first baseman is already out indefinitely with a partial meniscus tear…).

In baseball, what is known as the “Steroid Era” is thought to have run from the late ‘80s through the early 2000s. During this period, many players (some for certain, some suspected) used performance-enhancing drugs (PEDs) which resulted in an offensive explosion across baseball. As a result, homerun records revered by generations of fans were smashed and rendered meaningless.

It wasn’t just star players looking to become superstars who were using PEDs; it was also the fringe players, the ones struggling to win or keep jobs as big league ball players. They saw other players around them playing better, more often, and with fewer injuries. This resulted in promotions, from the minor leagues to the major leagues or from bench player to starter, and job security, in the form of multi-year contracts.

So, there now existed a professional ecosystem in baseball where those who were willing to skirt the rules could take a relatively quick and easy route to the level of production necessary to succeed and advance in their industry – shortcuts that would enhance their track records, improve their chances of winning and keeping jobs, and help build their professional profiles to ‘superstar’ levels, greatly increasing compensation as a result.

Is this much different than the situation for researchers in today’s academic publishing ecosystem?

Some authors – called “parasite authors” by Dr. Serhiy Kozmenko in a guest post for The Source – deliberately “seek symbiosis with predatory journals” in order to boost their publication records, essentially amassing publication statistics on steroids. Other authors, those not willing to use predatory journals as a simple path to publication, must operate in the same system but under a different set of rules that make it more difficult to generate the same levels of production. In this situation, how many authors who would normally avoid predatory journals will be drawn to them, just to keep up with those who use them to publish easily and frequently?

Is it time for asterisks on CVs?

At academic conferences, on message boards, and in other forums for discussing issues in scholarly communication, a familiar refrain is that predatory journals are easy to identify and avoid, so predatory publishing, in general, is not a big problem for academic publishing. While there is some truth to the claim that many, though not all, predatory journals are relatively easy to spot and steer clear of, this idea ignores the existence of parasite authors. These researchers are unconcerned about the quality of the journal; they are simply attempting to publish enough papers for promotion or tenure purposes.

Parasite authors are also likely to be undeterred by the fact that although many predatory journals are indexed in platforms such as Google Scholar, articles published in these journals have low visibility due to the algorithms used to rank research results in these engines. Research published in predatory journals is not easily discovered, not widely read, and not heavily cited, if at all. The work is marginalized and ultimately, the reputation of the researcher is damaged.

There are myriad reasons why an author might consider publishing in a predatory journal, some born of desperation. The ‘publish or perish’ system places pressure on researchers at all career stages – how much of the blame for this should be placed on universities? In addition, researchers from the Global South are fighting an uphill battle when dealing with Western publishing institutions. Lacking the same resources, training, language skills, and overall opportunities as their Western counterparts, researchers from the developing world often see no other choice but to use predatory journals (the majority of which are located in their part of the world) to keep pace with their fellow academics’ publishing activity.

To a large degree, Major League Baseball has been able to remove PEDs from the game, mostly due to increased random testing and more severe penalties for those testing positive. Stemming the flow of predatory publishing activity in academia will not be so straightforward. At the very least, to begin with, the scholarly community must increase monitoring and screening for predatory publishing activity (with the help of resources like Cabells’ Predatory Reports) and institute penalties for those found to have used predatory journals as publishing outlets. As in baseball, there will always be those looking to take shortcuts to success; having a system in place to protect those who do want to play by the rules should be of paramount importance.

Opening up the SDGs

While the United Nations Sustainable Development Goals (SDGs) offer a framework for global communities to tackle the world’s biggest challenges, there are still huge barriers to overcome in ensuring research follows the desired path. This week, Simon Linacre reflects on the ‘push’ and ‘pull’ effects in publishing and one organization trying to refine a fragmented infrastructure.

Recently, Cabells has been able to further its commitment to the UN SDGs by signing up to the SDG Publishers Compact and by sharing details of its pilot journal rating system, which assesses journals in terms of their relevance to the SDGs, with the Haub School of Business at Saint Joseph’s University. Part of the reason Cabells is working with the SDGs – aside from a simple belief that they are a force for good – is that they represent an opportunity to offer reward and recognition to researchers who are using their talents to make the world, in some small way, a better place.

The scholarly communications industry, like many others, relies on push and pull dynamics to maintain its growth trajectory. The push elements include the availability of citations and other metrics to judge performance, recognition for publishing in certain journals, and various community rewards for well-received research. On the flip side, pull elements include opportunities shared by academic publishers, a facility to record research achievements, and an opportunity to share findings globally. This is how the publishing world turns round.

This dynamic also helps to explain why potentially disruptive developments – such as Open Access or non-peer-reviewed journals and platforms – may fail to gain initial traction, and why they may require additional support in order to become embedded with academics and their mode of operations. Going back to the SDGs, we can see how their emergence could similarly be stymied by the existing power play in scholarly publishing – where are the push and pull factors guiding researchers to focus on SDG-related subjects?

I recently spoke to Stephanie Dawson, CEO at ScienceOpen, which is a discovery platform that seeks to enable academics to enhance their research in an open access environment and offer publishers ‘context building services’ to improve the impact of their outputs. ScienceOpen is very much involved with the UN SDGs, recently creating a number of content containers for SDG-related articles. By offering curative opportunities, post-publication enhancements, and article-level data services, ScienceOpen is most definitely doing its part to support a pull strategy in the industry.

Stephanie says, “We began this project working with the University College London (UCL) Library to showcase their outputs around the UN SDGs. Because we believe there needs to be broad community buy-in, we also wanted to encourage researchers globally to highlight their contributions to the Sustainable Development Goals by adding keywords and author summaries on ScienceOpen, regardless of the journal they published in and demanding publisher engagement for new works.”

And this is what Cabells is also trying to achieve – by offering new metrics that can be used to guide authors to the optimal publishing option (push) and highlighting traditionally overlooked journals with low citations as destination publications (pull), we hope we can change the conversation from ‘Is this a good journal?’ to ‘Does this research matter?’. And we think reframing the context like ScienceOpen is doing is an important first step.

Spotlight on Turkey

Turkey has been making great strides in recent years as a force to be reckoned with on the international research stage. However, it seems to have encountered more problems than other countries with regard to predatory journals. Simon Linacre looks at the problems facing the country and highlights some resources available to help Turkish scholars.

A simple Google search of “predatory journals Turkey” provides quick insight into the concerns academic researchers there have regarding these deceptive publications. Numerous articles fill the first pages of results highlighting the particular issue Turkey seems to share with a few other countries such as India and Nigeria. Alongside, however, are anonymous websites offering unsupported claims about predatory publications. Validated information appears to be thin on the ground.

Luckily, the Turkish government understands there is a problem and in the Spring of 2019 it decided to take action. According to Professor Zafer Koçak in his article ‘Predatory Publishing and Turkey’, the Turkish Council of Higher Education decreed that “scientific papers published in predatory journals would not be taken into account in academic promotion and assignment. Thus, Turkey has taken the step of becoming one of the first countries to implement this in the world”.

According to its website, the Turkish Council of Higher Education believed the phenomenon was increasing, and was doing so internationally. A number of articles have been published recently that back this up – for example here and here – and there is the potential for Turkish authors to get caught up in this global swell due to their increasing publication output.

To support Turkish authors and institutions, Cabells has translated its information video on its Journalytics and Predatory Reports products, as well as translating this page, into Turkish. Hopefully, the availability of independently verified information on predatory journals and greater dialogue will improve the conditions for Turkey and its scholars to continue to grow their influence in global research.




Cabells launches new SDG Impact Intensity™ journal rating system in partnership with Saint Joseph’s University’s Haub School of Business

Following hot on the heels of Cabells’ inclusion in the United Nations SDG Publishers Compact, we are also announcing an exclusive partnership with Saint Joseph’s University (SJU) for a new metric assessing journals and their engagement with the UN’s Sustainable Development Goals (SDGs). Simon Linacre explains the origins of the collaboration and how the new metric could help researchers, funders, and universities alike.

If you can remember way back to the halcyon days when we went to academic conferences, you will remember that one of the many benefits we enjoyed was meeting a kindred spirit – someone who shared your thoughts and ideas, and whom you looked forward to seeing again at another event. These international friendships also had the benefit of enabling you to develop something meaningful with your work, and went some way to justifying the time and expense the trips often entailed.

I was lucky enough to have one such encounter at the GBSN annual conference in Lisbon, Portugal at the end of 2019, when I met Professor David Steingard from Saint Joseph’s University in the US. He was at the event to present some of the work he had been doing at SJU on its SDG Dashboard – an interactive visualization and data analytics tool demonstrating how university programmes align with the 17 SDGs. At the gala dinner I sought Dr. Steingard out and asked him something that had been buzzing inside my head ever since I heard him speak:

What if we applied your SDG reporting methodology to journals?

An animated conversation then followed, which continued on the bus back to the hotel, at the conference the next day, and ultimately in the lobby of a swanky hotel in Davos (there are no other kinds of hotels there, to be honest) a year ago. From then on, small teams at SJU and Cabells have been working on a methodology for analysing and assessing the extent to which a journal has engaged with the UN’s SDGs through the articles it has published over time. This has resulted in the new metric we are releasing shortly – SDG Impact Intensity™ – the first academic journal rating system for evaluating how journals contribute to positively impacting the SDGs.

Using data collated from Cabells’ Journalytics database and running it through SJU’s AI-based methodology for identifying SDG relevance, SDG Impact Intensity™ provides a rating of up to three ‘SDG rings’ to summarise the SDG relevance of articles published in the journals over a five-year period (2016-2020). For the first pilot phase of development, we chose 50 of the most storied business and management journals used for the Financial Times Global MBA ranking as well as 50 of the most dynamic journals from Cabells’ Journalytics database focused on sustainability, ethics, public policy and environmental management.
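The SJU methodology itself is AI-based, but the general idea of rolling article-level SDG relevance up into a journal-level ring rating can be sketched in miniature. Everything below – the keyword list, the relevance test, and the thresholds – is invented for illustration and is not the published SDG Impact Intensity™ methodology:

```python
# Toy SDG vocabulary; the real system uses AI-based classification, not keywords.
SDG_KEYWORDS = {
    "poverty", "hunger", "health", "education", "gender",
    "water", "energy", "climate", "sustainability", "inequality",
}

def article_is_sdg_relevant(abstract: str) -> bool:
    """Crude relevance test: does the abstract mention any SDG keyword?"""
    words = set(abstract.lower().split())
    return bool(words & SDG_KEYWORDS)

def sdg_rings(abstracts: list[str]) -> int:
    """Map the share of SDG-relevant articles to a 0-3 ring rating.
    Thresholds are illustrative, not the published rating bands."""
    if not abstracts:
        return 0
    share = sum(article_is_sdg_relevant(a) for a in abstracts) / len(abstracts)
    if share >= 0.6:
        return 3
    if share >= 0.3:
        return 2
    if share >= 0.1:
        return 1
    return 0

journal = ["Climate adaptation and water policy",
           "A study of firm earnings",
           "Energy transitions in cities"]
print(sdg_rings(journal))  # → 3
```

The point of the sketch is the aggregation step: a journal’s rating reflects the share of its output that engages the SDGs over time, not any single article.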

It may come as no surprise to learn that the so-called top journals lagged way behind their counterparts when it came to their levels of SDG focus. For example, none of the top 26 journals in the pilot phase are from the FT50, and only four of the top ten are from the world’s five biggest academic publishers. In contrast, the journals traditionally ranked at the very top of management journal rankings over the past 50 years, in disciplines such as marketing, accounting, finance and management, languish at the bottom of the pilot phase ratings. While these results are hardly surprising, they perhaps show that while governments, funders and society as a whole have started to embrace the SDGs, this has yet to filter through to what is published in journals traditionally regarded as high impact. There has long been criticism that such titles are favoured by business school management structures over more innovative, real-world-relevant journals, and this very much seems to be borne out by the results of Cabells’ research with SJU. The very notion of what academic journal “quality” means is fundamentally challenged when one considers how journals can make an “impact” by engaging with the SDGs.

Cabells and SJU are hoping to further their partnership and broaden their coverage of journals to enable more researchers and other interested parties to understand the type of research their target journals are publishing. With more information and greater understanding of the SDGs at hand, it is to be hoped we see a move away from a narrow, single-focus on traditional quality metrics towards a broader encouragement of research and publication that generates a positive impact on bettering the human condition and environmentally sustaining the Earth as detailed in the SDGs. In turn, we should see academia and scholarly communications play their part in ensuring the UN’s 2030 Agenda for Sustainable Development moves forward that much quicker.

Predatory journals vs. preprints: What’s the difference?

While working towards publication in a legitimate journal, however circuitous the route, is of course a much better path than publishing in an illegitimate journal, Simon Linacre examines why this is a useful question to consider.

A blog post this week in The Geyser pointed out the problems surrounding version control of the same article on multiple preprint servers and on the F1000 platform.

TL;DR? It isn’t pretty.

The article used as an example is unquestionably a legitimate study relating to the coronavirus pandemic, and as such is a small but important piece in the jigsaw being built around science’s pandemic response. That the article has yet to be validated – and thereby confirmed as a piece that fits the COVID-19 jigsaw – is something that will presumably be remedied once it is published in a recognized peer-reviewed journal.

However, this does raise the following rather thorny question: how is the article any better served fragmented on different preprint servers and publishing platforms than it would be having been published as a single entity in a predatory journal?

I am being facetious here – working towards a legitimate publication, however circuitous the route, is far better than publishing in an illegitimate journal. However, comparing the two options is not as strange as one might think, and perhaps offers some guidance for authors uncertain about where to publish their research in the first place.

Firstly, early career researchers (ECRs), while often offered very little direction when it comes to publication ethics and decision-making, are understandably worried about sharing their data and findings on preprint servers for fear of being ‘scooped’ by other researchers who copy their results and get published first. This is a legitimate fear, and it is one explanation of why a researcher might submit their work to an unfamiliar journal offering a low fee and a quick turnaround.

Secondly, ECRs or more experienced researchers may be incentivised by their institutions to simply achieve a publication without any checks on the type of journal they publish in. As such, they need a journal to validate their publication – even if the journal itself has not been validated – which is something preprints or non-journal platforms are unable to provide.

Finally, while recent research has shown that just over half of articles published in predatory journals receive no citations at all, just under half do, and authors may prefer a single, accessible home for their research to multiple versions scattered across different preprint servers. This is not to say that preprints can’t receive citations – indeed, Google Scholar reveals 22 citations to the article above since its original posting on arXiv – but the perception may be that only journals can deliver citations, and journals will therefore be the aim for some authors.

Of course, authors should know the very real difference between a predatory journal and a preprint, but the evidence of 14,000+ journals in Cabells’ Predatory Reports database and the millions of spam emails sent daily by illegitimate journals points to at least some researchers falling for the same tricks and continuing to line the pockets of predatory publishers. While research publishing options remain as varied and as complex as they are – and while higher education institutions and funders simply assume every researcher has an effective publishing strategy – as many will fall into the predatory trap as ever have.

Cabells and scite partner to bring Smart Citations to Journalytics

Cabells, a provider of key intelligence on academic journals for research professionals, and scite, a platform for discovering and evaluating scientific articles, are excited to announce the addition of scite’s Smart Citations to Cabells Journalytics publication summaries.

Journalytics summary card with scite Smart Citations data

Journalytics is a curated database of over 11,000 verified academic journals spanning 18 disciplines, developed to help researchers and institutions optimize decision-making around the publication of research. Journalytics summaries provide publication and submission information and citation-backed data and analytics for comprehensive evaluations.

scite’s Smart Citations allow researchers to see how articles have been cited by providing the context of the citation and a classification describing whether it provides supporting or disputing evidence for the cited claim.

The inclusion of Smart Citations adds a layer of perspective to Journalytics metrics and gives users a deeper understanding of journal activity by transforming citations from a mere number into contextual data.

Lacey Earle, executive director of Cabells, says, “Cabells is thrilled to partner with scite in order to help researchers evaluate scientific articles through an innovative, comparative-based metric system that encourages rigorous and in-depth research.”

Josh Nicholson, co-founder and CEO of scite, says of the partnership, “We’re excited to be working with Cabells to embed our Smart Citations into their Journalytics summaries. Smart Citations help you assess the quantity of citations a journal has received as well as the quality of these citations, with a focus on identifying supporting and disputing citations in the literature.”


about cabells

Cabells generates actionable intelligence on academic journals for research professionals. On the Journalytics platform, an independent, curated database of more than 11,000 verified scholarly journals, researchers draw on the intersection of expertise, data, and analytics to make confident decisions and better administer research. In Predatory Reports, Cabells has undertaken the most comprehensive and detailed campaign against predatory journals, currently reporting on the deceptive behaviors of over 14,000 publications. By combining its efforts with those of researchers, academic publishers, industry organizations, and other service providers, Cabells works to create a safe, transparent and equitable publishing ecosystem that can nurture generations of knowledge and innovation. For more information, please visit Cabells or follow us on Twitter, LinkedIn and Facebook.

about scite

scite is a Brooklyn-based startup that helps researchers better discover and evaluate scientific articles through Smart Citations: citations that display the context of the citation and describe whether the article provides supporting or disputing evidence. scite is used by researchers from dozens of countries and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health. For more information, please visit scite, follow us on Twitter, LinkedIn, and Facebook, and download our Chrome or Firefox plugin. For careers, please see our jobs page.