Measuring Sustainability Research Impact

Cabells was excited and honored to have the opportunity to take part in the EduData Summit (EDS), which took place at the United Nations in New York City in June. The EDS is the “world’s premium forum for data-driven educators – a platform for strategists, data scientists, CIOs and other dataheads to discuss and share best practices at the intersection of big data, predictive analytics, learning analytics, and education.”

The theme of this year’s Summit was “The Virtuous Circle: Sustainable and inclusive life-long learning through EduData” and sessions focused on topics such as access to education, continued and distance education, innovation in data science and AI, and sustainability.

Cabells CTO Lucas Toutloff was joined by Rachel Martin, Global Sustainability Director at Elsevier, and David Steingard from Saint Joseph’s University’s Haub School of Business for the virtual presentation “Industry-University Collaboration for Impact with the UN SDGs.” The panel discussed the importance of connecting research and science to the United Nations Sustainable Development Goals (SDGs) broadly and, more specifically, of bridging the gap between researchers and practitioners. The SDGs are 17 interconnected goals spanning a wide range of environmental, social, and economic topics, and they represent a universal call to action for building a more sustainable planet by 2030.

“Industry-University Collaboration for Impact with the UN SDGs” presented at the EduData Summit at the United Nations, June 2022

Scholarly publishing can steer research and innovation toward the SDGs if we specifically and collectively shift our focus to address these crucial objectives and solutions. Researchers must lead the way by providing solutions that practitioners can put into action. As one of the first U.S. organizations, and one of the first non-primary publishers globally, to be awarded membership to the SDG Publishers Compact, and as a member of the Compact’s Fellows Group, Cabells is fully invested in helping to leverage the power of scholarly publishing to achieve the SDGs.

The SDG Publishers Compact and Fellows Group

The SDG Publishers Compact’s core mission is to create practical, actionable recommendations for stakeholders in every corner of academic research – publishers, editors and reviewers, researchers and students, authors and librarians – for how they can keep the SDGs at the forefront of their research agendas so we can collectively bridge the gap between research and practice.

The goal of the Compact Fellows Group is to encourage all areas of the ecosystem to share in incentivizing researchers to perform work that supports and addresses the SDGs and to help smooth the transition from research to practice. The Fellows Group has created specific best practices and recommendations for each sector that can be acted upon immediately to drive research into the hands of practitioners. The goal is to incentivize research that drives innovation to address the SDGs, which means we need ways to parse through, discover, and measure this research, because “what gets measured gets done.”

A major component of this process is establishing a broad spectrum of reporting and insights to drive incentives and measures of impactful research, gauging how an institution, individual researcher, or journal is performing in terms of the SDGs. SDG Publishers Compact members have a responsibility to drive research to action and impact, to devise ways to measure its effectiveness, to reward those who conduct and publish impactful research in impactful journals, and to continue to encourage those who don’t.

The SDG Impact Intensity Journal Rating

Toward this end, and in the spirit of SDG 17, “Partnerships for the Goals,” we are working with SJU on a publisher-neutral, AI-driven academic journal rating system assessing scholarly impact on the SDGs, called the SDG Impact Intensity™ (SDGII) journal rating. Data, scholarship, and science will be the driving forces for meeting the 2030 goal, and as SDG research output increases, funders, universities, and commercial and not-for-profit organizations need to know that money, time, and research effort are being well spent and having an impact.

We have discussed (here and here) our commitment to doing our part to advance progress on meeting the SDGs and, ultimately, the 2030 Agenda. Our work with Professor Steingard and his team from SJU in developing the SDGII to help business schools determine the impact their research is having on society by addressing global crises has been some of our most rewarding work. Working within the business school ecosystem, we’re examining how the SDGs can inspire a transformation from quality to impact in business by looking at journals in terms of their alignment and taxonomy connection to the SDGs.

The top 50 business school journals (according to the Financial Times in 2016) were examined by the United Nations PRME group, which found that only 2.8% of articles published in ‘top-tier’ journals address challenges such as poverty, climate change, clean energy, water, and equality. This problem persists today: many of the same journals are still among the top in business journal rankings, yet they are not championing and featuring impactful research to any meaningful degree.

Cabells and SJU are trying to address this problem through the SDGII by shifting the philosophy on what “counts” when looking at business journals and noting which publications are driving impact with respect to the SDGs. We are working to integrate, promote, and ultimately change the benchmarks of what matters in academic output and the data that drives decision-making.

To continue to promote this initiative and encourage the shift from quality to impact, we were thrilled to have the opportunity to discuss our progress at the AACSB’s International Conference and Annual Meeting (ICAM), in April in New Orleans.

Sustainability is the crisis of our generation, and sustainability-mindedness has become an increasingly important focus of academic research. The SDGII is designed to give stakeholders at every level the ability to measure what they are doing and to serve as a cross-motivational tool to drive the industry forward on issues of sustainability. As mentioned earlier, when it comes to incentives, what gets measured gets done. The traditional metrics for evaluating the quality of research journals focus mainly on citation intensity, which evaluates journals based on how much they are used and cited. While this makes sense on some level – research must be read to have an impact, after all – it misses the mark by not considering, and measuring, impact on the SDGs.

The SDGII is an alternative, complementary metric that evaluates a journal’s SDG research output using artificial intelligence and machine learning, building a profile for the publication to demonstrate its impact on these issues. Rather than throwing out the traditional approach to evaluating the quality and value of a journal, we are seeking to build on the foundation that good journals already have in terms of scholarly rigor, audience, citations, and rankings. We want to move the needle to highlight research and journals that address the SDGs, and the SDGII will help business schools demonstrate how their research is achieving societal impact and meeting the Global Goals.

Academic Sleuthing

With plenty of advice and guidance available online on how to identify and avoid predatory journals, many argue the game is up for them. However, Simon Linacre argues that with so many authors and journals still slipping through the net, a range of skills is required to avoid the pitfalls – not the least of which, as one case study shows, is being an amateur sleuth…


Back in the day when I lectured researchers on optimizing their publishing strategy, I would always use the refrain ‘research your research’ to underline the importance of applying the investigative skills of academic research to understanding scholarly communications. Knowledge is power, as the saying goes, and knowing how the medium of academic publishing works enables effective and robust decision-making, especially in academia, where those decisions can have a long-term bearing on careers. Knowing the best journals to publish in can prove a huge benefit to any given academic.

It turns out that knowing where NOT to publish can have the same benefits.

This notion was underlined to Cabells this month when an academic publications advisor highlighted a case they had been involved in at their university. The advisor – whose identity, and that of the institution, have been anonymized at their request – was based at a research institute and, among other duties, advised its researchers about submissions to academic journals, including copyediting, publishing licenses, and open access payments.

Recently, one of the institute’s academics had been invited to present at a conference in 2022, and the invitation was brought to the advisor’s attention as it was a little outside their normal sphere of activity. The advisor thought the invitation and presentation seemed unprofessional and advised against accepting it. Upon further investigation, they found that the conference was linked to a suspected predatory publisher, which had been flagged online by several different sources.

However, the advisor was still not satisfied: while there were suggested links and implications, there was also some evidence of legitimate activities and details. It was only when the advisor scrutinized some of the journals’ articles that they found further evidence of fake journals and scientific anomalies, and they asked us to confirm their suspicions. We were glad to confirm that the publisher in question – Knowledge Enterprises Inc. (KEI) – did indeed look suspicious and had six journals included in our Predatory Reports database [see image below for an example].

Predatory Reports entry for Journal of Economics and Banking from KEI Journals

The moral of this story is not just that ‘researching your research’ can help identify bad actors. It also shows that persistence with an investigation and a wide range of inputs from different sources are required to support ethical publication practices. In some cases, nothing less will do.

No Hiding from Predatory Menace

If you thought predatory publishing had had its day and things were improving, there is bad news – and, as Simon Linacre reports, surely worse behavior to follow in the coming weeks and months. However, the InterAcademy Partnership’s ongoing project studying predatory journals and conferences aims to educate researchers on how to identify and avoid these dangers.


As we wind down to the end of what has been yet another tumultuous year, and some of us look forward to the various holidays and celebrations that lie ahead, it is common to reflect on what has gone before and to wonder what a new year may bring. Particularly against the backdrop of the global pandemic and the dangers of climate change, it would be comforting to point to some issues that may be close to being resolved, or at least lessened in their negative impact.

Just don’t look to the fight against predatory publishing activities for any relief.

In a year that has seen Cabells’ Predatory Reports database pass 15,000 journals, a major study has begun releasing the findings of its global investigation into predatory journals and conferences. The InterAcademy Partnership (IAP) is an international network of scientific academies collaborating to provide trustworthy advice and guidance. Over the last 18 months it has tackled predatory activities as one of its projects, releasing some initial findings earlier this year. These showed that, from a survey of over 1,800 academics in 112 countries:

  • nearly a quarter had either published in a predatory journal, participated in a predatory conference, or didn’t know if they had
  • over 80% thought predatory practices were on the rise or a serious problem in their country of work
  • over half thought that such practices widened the research gap between high-income and low-income countries.

The full report is currently at the peer review stage and due for release early in 2022, with a research article also in the works. IAP believes that one of the main lessons of the study is that researchers in all countries, at all career stages, and in any discipline can be vulnerable to predatory practices, and as a result raising awareness is now a vital mission for IAP. In this vein, it has announced four regional webinars run through its IAP Regional Networks and a global webinar with The World Academy of Sciences (TWAS), all organized with the Global Young Academy and focused mainly on the research community.

Tickets for the free webinars are now available and open to everyone. If we can’t end the year on a high note with respect to predatory journals, at least we can try to ensure that we and our networks are as aware as possible of this dark phenomenon.

OA Week: Open Spectrum

This week sees the 14th Open Access Week (#OAWeek #OAWeek2021) since it started in 2008. To mark the event, Simon Linacre looks at the challenges and opportunities the movement may face in post-pandemic times.



For many in the scholarly communications industry, Open Access Week is as much a fixture on the calendar as the Frankfurt Book Fair and the Charleston Conference, which bookend it. So it may surprise people to learn that it started only as ‘Open Access Day’ in October 2008, as a follow-up to the National Day of Action for Open Access in February 2007, growing to a week’s worth of activity in 2009. OA has come a long way since then – but how far does it still have to go?

Open Access content was minimal in those days, with an estimated 8.5% of published articles available as OA in 2008 and a further 11.9% available in repositories. By 2020, several estimates put the share of research articles available via some form of OA at well over half of all articles published.

Judging the success of this growth since the inception of OA Week is difficult, and it probably depends where you sit on the spectrum of opinion on OA itself. If you strongly believe that all research should be freely available, period, then there is probably some frustration that a significant slice of content is still behind a paywall. The growth of OA as a percentage of all content has been sustained and consistent, but OA is unlikely to cover the vast majority of published articles for some time yet. This availability also varies hugely by geography, with some countries, such as the UK, having national mandates in place to ensure almost all newly published articles are Open Access.

If you are on the other side of the spectrum and have no problem with the traditional subscription model, then you may be surprised how developed OA has become. So-called transformative agreements, initiatives such as Plan S and the increased use of repositories for scholarly communications have all contributed to the tide turning in favor of OA.

And if you are on this side of the spectrum, you may also have concerns about the decreasing use of peer review as a method of validating research. The COVID-19 pandemic has both highlighted the risks of research being shared without peer review checks and stressed the importance of sharing vital medical research as quickly as possible. The net result is probably an acceleration, both of the availability of OA research and of worries about its consequences.

But where does this acceleration lead? It was perhaps inevitable that most research would become available as OA, and if funding – either for authors or for publishers – was available to cover the costs, few would disagree with this outcome. But for many it was never about when most research would be made OA but how that would happen, and for them the validation of research in an age of fake news and deepfake images is perhaps more important than ever.


New Kid on the Block

The publishing industry is often derided for its lack of innovation. However, as Simon Linacre argues, innovation is often going on right under our noses, where the radical nature of the changes is yet to be fully understood, for good or bad.



There is a clock ticking down on the top right of my screen. I have 15 mins and 28 seconds to upgrade to premium membership for half price. But what do I need to do? What is the clock for? What happens if I don’t click the button to stop the clock in time…?

This isn’t an excerpt from a new pulp fiction thriller, but one of the elements of a new journal many academics will have received notification of recently. Academia Letters is a new journal from Academia.edu, a social networking platform for researchers worldwide to share and discover research. Since it started life in 2008, the site has become popular with academics, with millions signed up to take advantage of the platform to promote their research and find others to collaborate with. It has also been controversial, accused of hosting thousands of article PDFs in breach of copyright terms. Until now, the site has focused on enabling researchers to share their work, but it has now joined the publishing game with its own journal publishing short articles of between 800 and 1,600 words.

The new offering provides several other different takes on the traditional journal:

  • All articles are Open Access but for a lower fee than average (£300 in the UK)
  • Peer review times are promised to be “lightning-fast”
  • Articles are accepted or rejected at the first round, with only minor revisions required if accepted.

Now, some people reading this will ask themselves: “Doesn’t that sound like a predatory journal?” However, Academia Letters is categorically NOT predatory in nature: far from attempting to deceive authors into believing there is in-depth peer review, it is upfront that its light-touch process and access to millions of users should make publishing both fast and cheap compared to other OA options. That said, the quality of articles would not be expected to match those in a traditional journal, given the brevity of the format and the limited intervention from peer reviewers in the new model.

It will be interesting to see how many authors take advantage of the new approach chosen by the journal. If it takes off, it could open up new formats from traditional publishers and other networking sites, and be held up as a clear example of innovation in scholarly communications. However, the journal’s approach to marketing may work against it, as authors have become increasingly wary of promises of fast turnaround times and low APCs from predatory publishers. For example, what happened when the ticking clock ran down, signifying the end of the half-price deal to become a premium member of Academia.edu? It simply reset to 48 hours for the same deal. Such marketing tactics may be effective in signing some authors up, but they may well put others off, however innovative the new proposition might be.

A New Perspective

What should a good quality journal include in its make-up – rigorous research, a well-regarded editorial board, plenty of citations? But what if we challenge these assumptions and demand commitment to the UN’s Sustainable Development Goals as well? There are solutions to this challenge, and here Simon Linacre introduces the first SDG Impact Intensity™ rating from Cabells and Saint Joseph’s University.


It is said that some of the best deals are done in a secluded restaurant or in the back of a cab. For academics, perhaps the equivalent is the fringes of a conference gala dinner and in a coach back to the hotel. That’s what happened when I met Dr. David Steingard from Saint Joseph’s University (SJU) in Lisbon in late 2019, where we discussed what an appraisal of journals from the perspective of the UN’s Sustainable Development Goals (SDGs) might look like.

First trialed in March of this year, the fruits of that meeting are released today in the shape of the SDG Impact Intensity™ journal rating. This pilot study – the first full ratings are expected in early 2022 – seeks to highlight the differences between business and management journals regarded as leaders in their disciplines and those that have focused on sustainability and related issues. The pilot consists of 100 journals rated according to their relevance – or intensity – with respect to the UN’s 17 SDGs, determined by the relative focus they have exhibited in their article output over the last five years. Using a sophisticated AI methodology from SJU applied to journals in Cabells’ Journalytics database, journals were rated from zero to five, with six journals achieving the top rating.
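
To make the idea of an ‘intensity’ rating concrete, here is a minimal sketch of how a zero-to-five journal rating could, in principle, be derived from article-level SDG relevance scores over a five-year window. It is purely illustrative: the `Article` structure, the relevance threshold, and the band cut-offs are assumptions made for this example and do not represent the actual AI methodology developed at SJU.

```python
# Illustrative sketch only: NOT the SJU/Cabells SDGII methodology, just a
# simplified picture of turning article-level SDG relevance into a 0-5
# journal rating. All field names and thresholds here are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class Article:
    year: int
    sdg_relevance: float  # assumed score in [0, 1] for relevance to any of the 17 SDGs

def sdg_impact_intensity(articles: List[Article], current_year: int = 2021) -> int:
    """Rate a journal 0-5 from the share of SDG-relevant articles in its last five years."""
    window = [a for a in articles if 0 <= current_year - a.year < 5]
    if not window:
        return 0
    # Share of recent articles judged substantially SDG-relevant (threshold assumed).
    share = sum(a.sdg_relevance >= 0.5 for a in window) / len(window)
    # Map that share onto a 0-5 band; the cut-offs are invented for illustration.
    cutoffs = [0.05, 0.15, 0.30, 0.50, 0.75]
    return sum(share >= c for c in cutoffs)

# Example: a journal where four of five recent articles are strongly SDG-relevant rates 5.
journal = [Article(2021, 0.9), Article(2020, 0.8), Article(2019, 0.7),
           Article(2018, 0.6), Article(2017, 0.1)]
print(sdg_impact_intensity(journal))  # -> 5
```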

Traditionally, citations and rankings have been a proxy for quality, none more so than the list of 50 journals used by the Financial Times for its FT Research rankings. However, to what extent have these journals started to reflect on research on climate change and the SDGs in recent years – a focus which should surely be a top priority for business and business schools alike?

The evidence from the SDG Impact Intensity™ journal rating is that… there has been very little focus at all. As you can see from the list of 100 journals, only two journals from the FT 50 appear in the top 50 of the list, showcasing the fact – as if there were any doubt – that sustainability journals, which have typically lagged behind top business journals in terms of citations and prestige, far outperform them when it comes to engagement with the SDGs and the research agenda they represent. We will watch with interest the FT’s plan for a “slow hackathon” this autumn as part of a review of its journal list.

Cabells started to investigate this area to see if there was another way to assess the value journals represent to authors looking to publish their work. What the last two years have shown is that, more than a shift in perspective, there is a paradigm shift waiting to happen as the value of journals to authors moves from old-fashioned prestige to a more dynamic representation of mission-driven research. While Cabells and some publishers have backed this general shift by signing up to initiatives such as the UN SDG Publishers Compact, much more can be done to progress the impact agenda in scholarly communications. Events such as the upcoming Higher Education Sustainability Initiative (HESI) webinar aim to tackle head-on the problem of aligning research programs and outcomes in publications. By highlighting those journals that are already focused on this alignment – and those that could do better – Cabells and SJU hope they can play a part in genuinely moving the dial.

Mountain to climb

As the return to university beckons for many of us, we are unfortunately reminded that many of the challenges facing scholarly communications persist. Simon Linacre assesses the wider issues affecting publication ethics as Cabells’ Predatory Reports database hits the 15,000-journal mark.


Last month saw two landmarks in my working life of the sort that makes you sit back and reflect on what you’re doing and why you’re doing it. The first was my three-year anniversary of starting work at Cabells – three of the most rewarding years I have spent in my career in scholarly publishing. The second was Cabells’ Predatory Reports database reaching a total of 15,000 journals – 15,059 at the time of this post, to be precise – pushed to that level by a recent surge in positive identifications of predatory journals.

What links these two milestones personally is that the Predatory Reports database hit the 10,000-journal mark just after I started work for Cabells, and one of my first tasks in my new role was to write a press release detailing the news for interested parties (a press release for the new milestone can be accessed here). At the time, it was mind-boggling to think that the problem had grown so big, and I wondered how many more journals would be discovered. Would the database reach 11,000 or 12,000 journals? Would the rate of increase level off or decline? In fact, the rate of increase has been maintained, with around 150 titles on average being added by Cabells’ journal audit team every month.

While the rate of increase has been steady, it has been punctuated by sharp gains when a new publisher is uncovered and its numerous cut-and-paste journals are included. As we saw in this blog post in July, where almost a third of the journals added came from a single publisher, new entrants to the market (or existing operators with new identities) are still driving up the numbers and, as a result, making it harder for researchers to find legitimate outlets for their papers.

One look at some recent stories in the higher education press points to a wider malaise for academics when it comes to publication ethics more generally. There has been a spate of stories in which publishers have had to retract articles from their journals because of evidence they came from paper mills, alongside increased scrutiny of data manipulation and concerns over gift, ghost, and fake authorship.

Luckily for authors, if the threats to publication ethics have never been greater, the solutions also seem to be proliferating. In addition to databases such as Cabells’ Predatory Reports that can aid decision-making for academics, there are many online courses now available, as well as new studies into how to train academics effectively in publication ethics. So while the number of predatory journals and the size of the publication ethics problem seem to be increasing, the tools to deal with these challenges at least seem to be keeping pace – which is the good news we need as we head back to school.

There was an attempt to hijack a journal…

As our journal investigation team members work their way around the expanding universe of scholarly publications, one of the more brazen and egregious predatory publishing scams they encounter is the hijacked, or cloned, journal.  One recent case of this scheme uncovered by our team, while frustrating in its flagrance, also offered some levity by way of its ineptness. But make no mistake, hijacked journals are one of the more nefarious and injurious operations carried out by predatory publishers. They cause extensive damage not just to the legitimate journal that has had its name and brand stolen, but to research and society as a whole, as noted recently in this piece from Retraction Watch on the hundreds of papers from hijacked journals found in the WHO COVID-19 library.

There are a few different variations on the hijacked journal, but all involve a counterfeit operation stealing the title, ISSN, and/or domain name of a legitimate journal to create a duplicate, fraudulent version of it. They do this to lure unsuspecting (or not) researchers into submitting their manuscripts (on any topic, not just those covered by the original, legitimate publication) with promises of rapid publication, for a fee.

The most recent case of journal hijacking investigated by our team involved the hijacking of this legitimate journal, Tierärztliche Praxis, a veterinary journal out of Germany with two series, one for small and one for large animal practitioners:

The website for the legitimate journal, Tierärztliche Praxis

by this counterfeit operation, using the same name:

The website for the hijacked version of Tierärztliche Praxis

One of the more immediate problems caused by cloned journals is how difficult they make it for scholars to discover and engage with the legitimate journal, as shown in the image below of Google search results for “Tierärztliche Praxis.” The first several search results refer to the fake journal, including the top result which links to the fake journal homepage.

“Tierärztliche Praxis” translates to “veterinary practice” in English, and the original journal is of course aimed at veterinary practitioners. Not so the fake Tierärztliche Praxis “journal,” which is aimed (sloppily) at anyone writing about anything who is willing to pay to have their article published:

The hijacked journal’s aim & scope: to sum up – they’ll accept any paper, on any topic

Aside from some of the more obvious signs of deception found with the cloned journal – a poor website with duplicate text and poor grammar, an overly simple submission process, and an incredibly wide range of topics covered, to name a few – this journal’s “archive” of (stolen) articles takes things to a new level.

The original article, stolen from Tuexenia vs. the hijacked version

A few things to note:

  • The stolen article shown in the pictures above is not even from the original journal that is being hijacked, but from a completely different journal, Tuexenia.
  • Note the white rectangle near the top left of the page covering the original journal’s title, the poorly superimposed hijacked journal title and ISSN in the header of each page, and the substituted volume information and page number in the footer (without even bothering to redact the original article’s page numbers).
  • The FINGER at the bottom left of just about every other page of this stolen article.

Sadly, not all hijacked or otherwise predatory journals are this easy to spot. Scholars must be hyper-vigilant when selecting a publication to which to submit their work. Refer to Cabells’ Predatory Reports criteria to become familiar with the tactics used by predatory publishers. Look at journal websites with a critical eye and be mindful of some of the more obvious red flags, such as promises of fast publication, no information on the peer review process, dead links or poor grammar on the website, or pictures (with or without fingers) of obviously altered articles in the journal archives.

Predatory Reports listing for the hijacked version of Tierärztliche Praxis

What lies beneath

The first set of data from Cabells’ collaboration with Inera’s Edifix shows that nearly 300 article checks included references to predatory journals. Simon Linacre looks behind the data to share more details about ‘citation contamination.’


A few months ago, Cabells announced a trial partnership with the Edifix service, an article-checking tool from Wiley’s Inera division (watch the free webinar discussing the collaboration in SSP’s OnDemand Library). Subscribers to Edifix can check their articles’ references against Cabells’ Predatory Reports database for free during an open beta phase, and the first results of this offer have been announced by Edifix on its latest blog. The results show that:

  • Since May 2021, a total of 295 jobs have had at least one reference flagged as appearing in a journal currently listed in Cabells’ Predatory Reports
  • Of those 295 jobs, 66 (22%) included multiple references to articles in predatory journals
  • Over the same period, Edifix processed a total of 7,102 jobs (containing 104,140 submitted references, of which Edifix was able to fully process 89,180), so overall around 4% of all live jobs included at least one reference flagged by Cabells’ Predatory Reports database.

To recap, it is in the interests of all stakeholders in scholarly communications – authors, universities, societies, funders, and society as a whole – that research is not lost to predatory publishing activities. The Edifix and Cabells collaboration is designed not only to offer access to a database such as Predatory Reports to help all these stakeholders, but to augment their capabilities to produce the best research.

In addition, the collaboration represents a step forward in preventing ‘citation contamination’, where articles published in predatory journals find their way into legitimate journals by being referenced by them directly. The new service allows users to vet references for citations to predatory journals, as identified by Predatory Reports, and reduce the contamination of the scholarly record.
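
As a conceptual illustration of what such vetting involves – this is not the actual Edifix service or Cabells’ integration, and the reference format and flagged sets below are hypothetical stand-ins – the sketch cross-checks a parsed reference list against a locally held set of flagged journal titles and ISSNs.

```python
# Conceptual sketch of 'citation contamination' checking. It does not call the
# real Edifix or Predatory Reports APIs; the reference dictionaries and the
# flagged sets are made-up stand-ins for illustration.
from typing import Dict, Iterable, List, Set

def normalize(value: str) -> str:
    """Lower-case and collapse whitespace so title matching is less brittle."""
    return " ".join(value.lower().split())

def flag_contaminated_references(
    references: Iterable[Dict[str, str]],
    flagged_titles: Set[str],
    flagged_issns: Set[str],
) -> List[Dict[str, str]]:
    """Return the references whose journal title or ISSN appears in the flagged sets."""
    titles = {normalize(t) for t in flagged_titles}
    return [
        ref for ref in references
        if normalize(ref.get("journal", "")) in titles
        or ref.get("issn", "") in flagged_issns
    ]

# Example usage with made-up data: a flagged reference can then be reviewed,
# replaced with a peer-reviewed source, or cited with appropriate caution.
refs = [
    {"journal": "Journal of Totally Legitimate Results", "issn": "1111-2222"},  # made up
    {"journal": "A Reputable Journal of Medicine", "issn": "3333-4444"},        # made up
]
print(flag_contaminated_references(refs, {"Journal of Totally Legitimate Results"}, set()))
```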

It is important to underline that while checking references won’t remove the predatory journal publications in the first place, it will ensure that those articles are cited less and that the research they include is checked. Authors cite articles assuming that what is included in them has been peer reviewed, which is the very thing that is most unlikely to have happened at a predatory journal. If an author understands that the work they are citing may not have had any peer review – or only a sub-standard or superficial one – they can find other literature to support their case. The analogy of contamination is a strong one: not only does it conjure up the stench many feel predatory publishing practices represent, it also describes how the problem can ‘cross-contaminate’ other journals and research projects. By empowering authors to clean up their research, and by highlighting the problem of contamination more widely, it is hoped that this early experiment can lead to further steps forward in the fight against predatory publishing.

No signs of slowing

Cabells adds journals to its Predatory Reports database continuously, with over 10,000 added to the original 4,000 it launched with in 2017. But can we learn anything from the journals that have been added recently? To find out, Simon Linacre takes a look at the predatory journals listed throughout June 2021.


Fancy reading up on some research to learn about COVID-19? A lot of people will have been doing just that over the last 18 months as they try to figure out for themselves what on earth has been happening. They will do some Google searches and read articles published in journals like the British Medical Journal, the New England Journal of Medicine, and the Open Journal of Epidemiology. Sure, the third one doesn’t sound quite as prestigious as the other two, but it has a bunch of articles on epidemics, and it’s all free to read – so that’s good, right?

Sadly, it’s not so good. The Open Journal of Epidemiology is one of 99 journals that Cabells listed in its Predatory Reports database last month, and it is published by a well-known predatory publisher, SCIRP (Scientific Research Publishing), based in China. The journal – not to be confused with the British Open Journal of Epidemiology or the American Open Epidemiology Journal, both in Predatory Reports as well – has dubious publication practices such as falsely claiming indexation in well-known databases, promising unusually quick peer review, and publishing authors several times in the same journal and/or issue.

The journal’s search function points to a handful of articles relating to ‘COVID’, including one on ex-patients and their recovery that has been downloaded 200 times and viewed nearly 600 times, according to the website. But we know that any article in this journal is unlikely to have been given a full peer review, if any at all, and the data on the website is difficult to trust – the Open Journal of Epidemiology was just one of 26 journals from the same publisher that Cabells listed last month.

In total, eight publishers had journals listed in June, the biggest being Bilingual Published Co., based in Singapore, with 30 journals. The other publishers had fewer journals listed and were based in several different countries – India, Hong Kong, Kenya, and even Belgium – and it is worth pointing out that Cabells reviews each journal independently rather than the publisher as a whole.

What else can we glean from this selection of predatory titles? Just 11 of the 99 had no ISSN, further underlining the folly of treating the existence of an ISSN as evidence of legitimacy. On average the journals were four to five years old, so reasonably well established, and they were predominantly based in STEM research areas. Of the 99 journals listed, just 13 were in non-STEM areas such as Education and Management. The most common subject was Medicine, with no fewer than 38 journals represented. However, it is worth pointing out that many predatory publishers are either hopelessly generic or will publish anything, even if the article has nothing to do with the core topics of the journal.
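
For readers curious how a monthly summary like the one above might be assembled, here is a minimal sketch under the assumption that each newly listed journal is represented as a simple record with `issn`, `first_year`, and `subject` fields; the field names and data model are hypothetical and do not reflect Cabells’ internal systems.

```python
# Hypothetical sketch: summarizing a month's newly listed journals, assuming
# each record carries 'issn', 'first_year', and 'subject' fields. These names
# are illustrative only and not Cabells' actual data model.
from collections import Counter
from statistics import mean
from typing import Dict, List

def summarize_listings(journals: List[Dict], current_year: int = 2021) -> Dict:
    """Produce the kind of headline figures discussed above for one month's listings."""
    ages = [current_year - j["first_year"] for j in journals if j.get("first_year")]
    subjects = Counter(j.get("subject", "Unknown") for j in journals)
    return {
        "total": len(journals),
        "without_issn": sum(1 for j in journals if not j.get("issn")),
        "average_age_years": round(mean(ages), 1) if ages else None,
        "subject_counts": subjects.most_common(),
    }

# e.g. summarize_listings(june_listings) would report totals such as 99 journals,
# 11 without an ISSN, an average age of four to five years, and Medicine on top.
```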

Cabells is being kept rather busy reviewing all these journals, but if you do spot a suspicious journal or receive one of those annoying spam emails, do let us know at journals@cabells.com and we can perform a review so that others won’t be deceived or fall into the numerous traps being laid for them.