Predatory publishing from A to Z

During 2019, Cabells published at least one of the 70+ criteria it uses to include journals on its Journal Blacklist on its Twitter feed (@CabellsPublish), generating great interest among its followers. For 2020, Simon Linacre highlights a new initiative below: each week, Cabells will publish an entry in its A-Z of predatory publishing to help authors identify and police predatory publishing behavior.


This week a professor I know well came to me for some advice. He had been approached by a conference to present a plenary address on his research area but had been asked to pay the delegate fee. Something didn’t seem quite right, so, knowing I had some experience in this area, he asked me for guidance. Having spent considerable time looking at predatory journals, I did not take long to notice the signs of predatory activity: a direct commissioning approach from an unknown source; a website covering hundreds of conferences; conferences covering very wide subject areas; unfamiliar conference organizers; guaranteed publication in an unknown journal; and evidence online of other researchers questioning the legitimacy of the conference and its organizers.

Welcome to ‘C for Conference’ in Cabells’ A-Z of predatory publishing.

From Monday 17 February, Cabells will be publishing some quick hints and tips to help authors, researchers and information professionals find their way through the morass of misinformation produced by predatory publishers and conference providers. This will include links to helpful advice, as well as the established criteria Cabells uses to judge if a journal should be included in its Journal Blacklist. In addition, we will be including examples of predatory behavior from the 12,000+ journals currently listed on our database so that authors can see what predatory behavior looks like.

So, here is a sneak preview of the first entry: ‘A is for American’. The USA is a highly likely source of predatory journal activity, as an American association lends credence to any claim of legitimacy a journal may adopt to hoodwink authors into submitting articles. In the Cabells Journal Blacklist there are over 1,900 journals that include the name ‘American’ in their title or publisher name; by comparison, just 308 Scopus-indexed journals start with the word ‘American’. For example, the American Journal of Social Issues and Humanities purports to be published from the USA, but this cannot be verified, and it has 11 violations of Journal Blacklist criteria, including the use of a fake ISSN and the complete absence of any editor or editorial board member listed on the journal’s website.

‘A’ also stands for ‘Avoid at all costs’.

Please keep an eye out for the tweets and other blog posts related to this series, which we will use from time to time to dig deeper into understanding more about predatory journal and conference behavior.

Look before you leap!

A recent paper published in Nature has provided a tool for researchers to use to check the publication integrity of a given article. Simon Linacre looks at this welcome support for researchers, and how it raises questions about the research/publication divide.

Earlier this month, Nature published a well-received comment piece by an international group of authors entitled ‘Check for publication integrity before misconduct’ (Grey et al, 2020). The authors wanted to create a tool to enable researchers to spot potential problems with articles before they got too invested in the research, citing a number of recent examples of misconduct. The tool they came up with is a checklist called REAPPRAISED, which uses each letter to identify an area – such as plagiarism or statistics and data – that researchers should check as part of their workflow.
 
As a general rule for researchers, and as a handy mnemonic, the tool seems to work well, and authors using it as part of their research workflow should avoid the potential pitfalls of relying on poorly researched and poorly published work. Perhaps we at Cabells would argue that an extra ‘P’ should be added for ‘Predatory’, covering the checks researchers should make to ensure the journals they use and intend to publish in are legitimate. To do this comprehensively, we would recommend using our own criteria for the Cabells Journal Blacklist as a guide and, of course, using the database itself where possible.
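To make the idea concrete, here is a minimal sketch of how such a checklist might be folded into a pre-submission workflow. The item names are purely illustrative – the piece above mentions plagiarism and statistics and data, and the ‘Predatory’ item is the extra ‘P’ suggested here – and the function is hypothetical rather than part of REAPPRAISED itself.

```python
# Minimal sketch of a pre-submission integrity checklist, inspired by the idea
# of working item by item through integrity checks before committing to a journal.
# The items below are illustrative only (the full checklist covers more areas);
# the "Predatory" item is the extra check suggested in this post.

CHECKLIST = {
    "Plagiarism": "Has the text been screened for overlap with existing work?",
    "Statistics and data": "Are the statistics and underlying data sound and available?",
    "Predatory": "Has the target journal been checked against a trusted source, "
                 "such as the Cabells Journal Blacklist?",
}

def review_manuscript(answers: dict[str, bool]) -> list[str]:
    """Return the checklist items that have not yet been confirmed."""
    return [item for item in CHECKLIST if not answers.get(item, False)]

if __name__ == "__main__":
    # Hypothetical example: the author has checked plagiarism and data,
    # but has not yet verified the journal itself.
    answers = {"Plagiarism": True, "Statistics and data": True, "Predatory": False}
    for item in review_manuscript(answers):
        print(f"Outstanding check - {item}: {CHECKLIST[item]}")
```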
 
The guidelines also raise a fundamental question for researchers and publishers alike as to where research ends and publishing starts. For many involved in academia and scholarly communications, the two worlds are inextricably linked and overlap, but are nevertheless different. Faculty members of universities do their research thing and write articles to submit to journals; publishers manage the submission process and publish the best articles for other academics to read and in turn use in their future research. 
 
Journal editors sit at the nexus of these two areas: they tend to be academics themselves while working for the publisher, and so have a foot in both camps. But while they are knowledgeable about the research that has been done and may be active researchers themselves, the editor’s role is performed on behalf of the publisher, and it is the editor who ultimately decides which articles are good enough to be recorded in the publication: the proverbial gatekeeper.
 
What the REAPPRAISED tool suggests, however, is that for authors the notional research/publishing divide is not a two-stage process but a continuum. Only if authors embark on research intent on fully apprising themselves of all aspects of publication integrity can they guarantee the integrity of their own research, which in turn includes how and where that research is published. Rather than treating it as a two-step process, authors can better ensure the quality of their research AND their publications by including all publishing processes in their own research workflow. By doing this, and by using tools such as REAPPRAISED and the Cabells Journal Blacklist along the way, authors can take better control of their academic careers.


Beware of publishers bearing gifts

In the penultimate post of 2019, Simon Linacre looks at the recent publication of a new definition of predatory publishing and asks whether such a definition is fit for purpose for those who really need it – authors.


In this season of glad tidings and good cheer, it is worth reflecting that not everyone who approaches academic researchers bearing gifts is necessarily Father Christmas. Indeed, the seasonal messages popping into their inboxes at this time of year may offer opportunities to publish that seem too good to miss, but in reality they could easily be a nightmare before Christmas.
 
Predatory publishers are the very opposite of Santa Claus. They will come into your house, eat your mince pies, but rather than leave you presents they will steal your most precious possession – your intellectual property. Publishing an article in a predatory journal could ruin an academic’s career, and it is very hard to undo once it has been done. Interestingly, one of the most popular case studies this year on COPE’s website is on what to do if you are unable to retract an article from a predatory journal in order to publish it in a legitimate one. 
 
Cabells added over two thousand journals to its Journal Blacklist in 2019, and the total will reach 13,000 in the New Year. Identifying a predatory journal can be tricky, which is why they are often so successful in duping authors; yet defining exactly what a predatory journal is can be fraught with difficulty. In addition, some commentators do not like the term: from an academic perspective ‘predatory’ is hard to define, while others think it is too narrow. ‘Deceptive publishing’ has been put forward, but this, in turn, could be seen as too broad.
 
Cabells uses over 70 criteria to identify titles for inclusion in its Journal Blacklist and widens the net to encompass deceptive, fraudulent and/or predatory journals. Defining what characterizes these journals in just a sentence or two is hard, but this is what a group of academics has done following a meeting in Ottawa, Canada earlier in 2019 on the topic of predatory publishing. The output of this meeting was the following definition:
 
“Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.” (Grudniewicz et al, 2019)
 
The definition is presented as part of a comment piece published in Nature last week and came from a consensus reached at the Ottawa meeting. It is a pity that Cabells was not invited to the event and given the opportunity to contribute. As it is, the definition and accompanying explanation have been met with puzzlement in the Twittersphere, with a number of eminent Open Access advocates saying it allows almost any publisher to be described as predatory. For it to be relevant, it will need to be adopted and used by researchers globally as a test for any journal they are thinking of submitting to. Only time will tell if this will be the case.


From all of us at Cabells, we wish everyone a joyous holiday season and a healthy New Year. Our next blog will be published on January 15, 2020.

The Journal Blacklist surpasses the 12,000-journal mark

Just how big a problem is predatory publishing? Simon Linacre reflects on this week’s news that Cabells’ Journal Blacklist has reached 12,000 journals and shares some insights into publishing’s dark side.


Predatory publishing has seen a great deal of coverage in 2019, with a variety of sting operations, opinion pieces and studies published on various aspects of the problem. It seems that while, on the one hand, there is no doubt that it is a problem for academia globally, on the other there is huge debate as to the size, shape and relative seriousness of that problem.

On the first of those points, the size looks to be pretty big – Cabells announced this week that its Journal Blacklist has hit the 12,000 mark. This comes less than a year after it hit 10,000, and it is now triple the size it was when it launched in 2017. Much of this is down to the incredibly hard work of its evaluations team, but also to the fact that there are a LOT of predatory journals out there, with the numbers increasing daily.

On the last of those points, the aftershocks of the Federal Trade Commission’s ruling against OMICS earlier this year are still being felt. While there is no sign of any contrition on the part of OMICS – or of the $50m fine being paid – the finding has garnered huge publicity and acted as a warning to some academics not to entrust their research to similar publishers. In addition, it has been reported that CrossRef has now cut OMICS’ membership.

However, the shape of the problem is still hard for many to grasp, and perhaps it would help to share some of the tools of the trade of deceptive publishers. Take one journal on the Cabells Journal Blacklist – the British Journal of Marketing Studies.

[Screenshot: the British Journal of Marketing Studies entry in the Cabells Journal Blacklist]

Sounds relatively normal, right? But a number of factors relating to this journal highlight many of the problems presented by deceptive journals:

  • The title includes the word ‘British’ as a proxy for quality; however, over 600 Blacklist journals include this descriptor, compared with just over 200 across Scopus’ entire index of more than 30,000 journals (see the quick comparison after this list)
  • The journal is published by European-American Journals alongside 81 other journals – a remarkable feat considering the publisher lists a small terraced house in Gillingham as its main headquarters
  • When Cabells reviewed it for inclusion in the Blacklist, it noted among other things that:
    • It falsely claimed to be indexed in well-known databases – we know this because among these was Cabells itself
    • It uses misleading metrics, including an “APS Impact Factor” of 6.80 – no such derivation of the Web of Science metric exists, except on other predatory journal sites
    • There is no detailed peer review policy stated
    • There is no affiliation for the Editor, one Professor Paul Simon, and searches cannot uncover any marketing professors with such a name (or a Prof. Garfunkel, for that matter)
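As a rough, back-of-the-envelope illustration of that first point, the figures quoted above can be turned into shares of each index. The counts are approximate, so the result should be read as an order-of-magnitude comparison rather than a precise statistic.

```python
# Rough comparison using the approximate figures quoted above: about 600 of
# ~12,000 Blacklist journals use 'British' in the title, versus just over
# 200 of Scopus' 30,000+ indexed journals.
blacklist_british, blacklist_total = 600, 12_000
scopus_british, scopus_total = 200, 30_000

blacklist_share = blacklist_british / blacklist_total   # ~5%
scopus_share = scopus_british / scopus_total            # ~0.7%

print(f"Share of Blacklist titles using 'British': {blacklist_share:.1%}")
print(f"Share of Scopus titles using 'British':    {scopus_share:.1%}")
print(f"'British' is roughly {blacklist_share / scopus_share:.1f}x more common among Blacklist titles")
```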

This IS a problem for academia because, no matter what the size and seriousness of predatory publishing may be, unless researchers learn to spot the signs of what it looks like, they will continue to be drawn in and waste their research, funding dollars, and even careers on deceptive publishing practices.

FTC v. OMICS: a landmark predatory publishing case

In March of 2019, upon review of numerous allegations of predatory practices against the publisher OMICS International, the U.S. District Court for the District of Nevada ordered OMICS to pay $50.1 million in damages. The case marks one of the first judgments against a publisher accused of predatory practices and could be a signal of greater publisher oversight to come.


In March of this year, a US federal court ordered OMICS International to pay over $50 million in damages stemming from a 2016 lawsuit brought by the Federal Trade Commission (FTC), the first such action against a ‘predatory’ publisher.  The FTC was moved to act against the Hyderabad, India-based open access publisher and its owner, Srinubabu Gedela, after receiving a multitude of complaints from researchers concerning several systematic fraudulent practices.

In April we wondered if this decision would be more than a public record and condemnation of OMICS’ practices, but also act as a deterrent to other similar operations. Stewart Manley, a lecturer for the Faculty of Law at the University of Malaya, has gone deeper in examining this topic in two recent articles: “On the limitations of recent lawsuits against Sci‐Hub, OMICS, ResearchGate, and Georgia State University” (subscription required) featured in the current issue of Learned Publishing, and “Predatory Journals on Trial: Allegations, Responses, and Lessons for Scholarly Publishing from FTC v. OMICS” from the April issue of Journal of Scholarly Publishing (subscription required).

Mr. Manley was also recently interviewed for Scholastica’s blog where he addressed several key questions on this topic and felt that other questionable publishers will likely not be deterred if OMICS wins on appeal or simply refuses to comply with the order. He also lays out the key takeaways from FTC v. OMICS for publishers, academics, and universities.

Another recent article, “OMICS, Publisher of Fake Journals, Makes Cosmetic Changes to Evade Detection” by Dinesh C. Sharma for India Science Wire discusses a recent study showing the evolution of OMICS journals to mimic legitimate journals, making it difficult to distinguish between authentic and fake journals using the standard criteria. Rather than make substantive changes to their practices, OMICS is finding ways to more effectively evade quality checks.

Despite the hits OMICS has taken in actual court and in the court of public opinion, with an appeal in the offing, the final outcome of this matter is still to be determined. Additionally, as Mr. Manley points out in his Q&A, enforcing a judgment such as this is difficult, especially when the defendant is from a foreign jurisdiction. OMICS has yet to comply with the order and there is little reason to believe they ever will. We will continue to monitor this case and will provide updates as they become available.

The power of four

After hearing so many different ways that its Journal Whitelist and Journal Blacklist have been used by customers, Cabells has started to map out how any researcher can use journal data to optimize their decision-making. Fresh from its debut at the World Congress on Research Integrity in Hong Kong last month, Simon Linacre shares the thinking behind the new ‘Four Factor Framework’ and how it could be used in the future.


The 6th World Congress on Research Integrity (WCRI) was held in Hong Kong last month, bringing together the great and the good of those seeking to improve the research process globally. Topics were surprisingly wide-ranging, covering such diverse areas as human rights, predatory publishing, data sharing, and policy. It was significant that while much of the focus of the conference was on the need to improve education and learning on how to conduct research ethically, many of the cases presented showed that there is still much to do in this respect.

Cabells was also there and used its presence to share some ideas on how to overcome some of these challenges, particularly with regard to engagement with improved research and publishing practices. Taking the established issues within predatory publishing encountered the world over as a starting point (i.e. choosing the wrong journal), as well as the need to look at as much data as possible (i.e. choosing the right journal), Cabells has very much put the author at the center of its thinking to develop what it has called the ‘Four Factor Framework’.


The framework, or FFF, first puts the onus on the researcher to rule out any poor, deceptive or predatory journals, using resources such as the Blacklist. This ‘negative’ first step then opens up the next stage, which is to take the following four factors into account before submitting a research paper to a journal (a simple sketch of this two-stage approach follows the list below):

  • Strategic: understanding how a journal will impact career ambitions or community perspectives
  • Environmental: bringing in wider factors such as research impact or ethical issues
  • Political: understanding key considerations such as publishing in titles on journal lists, avoiding such lists or journals based in certain areas
  • Cultural: taking into account types of research, peer review or article form
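Purely to illustrate the shape of that two-stage process, here is a minimal sketch in code. The scoring scheme, class names and weights are hypothetical and not part of Cabells’ framework; the point is simply that blacklisted titles are excluded first, and only then are the remaining candidates weighed against the four factors.

```python
# Minimal sketch of the two-stage approach described above: first rule out
# deceptive or predatory titles, then weigh the four factors. The scoring
# scheme here is purely illustrative and not part of Cabells' framework.

from dataclasses import dataclass

FACTORS = ("strategic", "environmental", "political", "cultural")

@dataclass
class Journal:
    name: str
    on_blacklist: bool      # e.g. listed in the Cabells Journal Blacklist
    factor_scores: dict     # author's own 0-10 rating for each factor

def shortlist(journals: list[Journal]) -> list[tuple[str, float]]:
    """Drop blacklisted titles, then rank the rest by their four-factor score."""
    candidates = [j for j in journals if not j.on_blacklist]   # 'negative' first step
    return sorted(
        ((j.name, sum(j.factor_scores.get(f, 0) for f in FACTORS)) for j in candidates),
        key=lambda pair: pair[1],
        reverse=True,
    )

if __name__ == "__main__":
    journals = [
        Journal("Journal A", on_blacklist=True,
                factor_scores={"strategic": 9, "environmental": 8, "political": 9, "cultural": 9}),
        Journal("Journal B", on_blacklist=False,
                factor_scores={"strategic": 7, "environmental": 6, "political": 8, "cultural": 7}),
    ]
    print(shortlist(journals))   # Journal A is excluded despite its high scores
```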

Having talked to many customers over a period of time, we know these factors all become relevant to authors at some point during that crucial period when they are choosing which journal to publish in. Customers have fed back to Cabells that use of Cabells’ Whitelist and Blacklist – as well as other sources of data and guidance – can be framed as benchmarking, performance measurement or risk management. While it is good to see that the databases can help so many authors in so many different ways, judging by the evidence at WCRI there is still a huge amount to do in educating researchers to take advantage of these optimized approaches. And this will be the main aim of Cabells’ emerging strategy – to enable real impact by researchers and universities through the provision of validated information and support services around scholarly publishing.

Look to the North

Say what you like about Canada – and plenty do – but they are taking the threat of predatory publishing more seriously than just about any other country. Simon Linacre reflects on recent activities North of the 49th parallel.


It’s all about Canada at the moment. They have the new NBA champions in the shape of the Toronto Raptors, overcoming huge odds to beat defending champions Golden State Warriors in a thrilling finals series, and in the process becoming the first Canadian winner of a major US sports championship in over 25 years. Add to that one of the more likable (if under pressure) world leaders, and continued dominance of lists of best places to live, and those living North of the border seem to have it sewn up, eh?

They also seem to be leading the way when it comes to research integrity and publishing ethics, as a number of high-profile studies and articles have shown. A piece in Canadian news magazine The Walrus has highlighted the problems of predatory publishing in Canada, but also the fight to overcome these concerns. The article is entitled ‘The Rise of Junk Science’ and highlights the ways predatory publishing has infected scholarly activities in Canada, including:

  • Seeing predatory publishers buy out legitimate publishers and use their name to support predatory conferences
  • Wasting resources from funding organizations and universities on APCs for predatory journals
  • Forcing reputable universities to check the CV of every academic to ensure they haven’t published in predatory journals
  • Running the risk that ‘junk science’ published in seemingly authentic journals will be read and used, causing unknown harm
  • Allowing unscrupulous academics to take advantage of university policy to publish in journals with no peer review or quality control in order to be rewarded.

This last point was also highlighted by Derek Pyne in 2017, who pointed out that some of his colleagues at Thompson Rivers University had published in predatory journals and received rewards as a result. Pyne was suspended for a brief spell following publication of the article, but it acted as a wake-up call to many institutions to review their policies when it came to publications.

Canada also hosted a conference on predatory journals this year – as opposed to the numerous predatory conferences that have sprung out of predatory publishing practices. These are also highlighted by the article in The Walrus, which gives a great overview of the predatory publishing problem in Canada. Cabells’ own evidence suggests there is a specific issue in the country: the Blacklist contains 119 confirmed and over 500 suspected predatory journals originating from Canada, nearly 5% of the total. However, by shining a light on the problem and tackling it head-on, the country can at least lead the way for many others to follow.

Faking the truth

Predatory publishing can cause harm in all sorts of ways, but so can fighting it with the wrong ammunition. In this blog post, Simon Linacre looks at examples of how organizations have gone the wrong way about doing the right thing.


One of the perks – and also the pains – of working in marketing is that you have to spend time trawling through social media posts. It is painful because no matter how good your filters are, there is a huge amount of unnecessary, unearthly and unhealthy content being shared in absolute torrents. On the flip side, however, there are a few gems worth investigating further. Many of them prove to be rabbit holes, but nevertheless, the chase can be worthwhile.

Searching through some posts earlier this month I happened upon mention of an updated list of recommended and predatory journals. Obviously, this is our gig at Cabells so I was genuinely intrigued to find out more. It turns out that the Directorate General of Scientific Research and Technological Development (RSDT) in Algeria has produced three lists on its website – two of recommended journals for Algerian researchers in two subject categories, and one of predatory journals and publishers.

Judgment

A cursory look at the predatory list shows that the first 100 or so journals match Beall’s archived list almost exactly. Furthermore, nothing on the website explains how or why such a list exists, other than an open warning to authors who publish in one of the journals listed:

“To this effect, any publication in a journal in category A or B which is predatory or published by a predatory publisher, or that exclusively publishes conference proceedings, is not accepted for defense of a doctoral thesis or university tenure.” (own translation)

In other words, your academic career could be in trouble if you publish in a journal in the RSDT list.

Consequences

The rights and wrongs, accuracies and inaccuracies of Beall’s list have been debated elsewhere, but it is fair to say that as Beall was trying to eradicate predatory publishing practices by highlighting them, some journals were missed while some publishers and their titles were perhaps unfairly identified as predatory. Now the list is over two years out of date, with one version being updated by no-one-knows-who. So what are the consequences for Algerian academics – and authors from anywhere else who are judged by the same policy – of where they choose to publish?

  1. Publish in a journal that is on the RSDT list but is not actually predatory: Career trouble
  2. Publish in a journal that is on the RSDT list and is predatory: Career trouble
  3. Publish in a journal that is not on the RSDT list but is predatory: Career trouble
  4. Publish in a journal that is not on the RSDT list and is not predatory: Career OK

Option 4 is obviously the best outcome, and Option 2 is a sad result for authors who have not done their homework and de-risked their publishing strategy. But it seems there will be a large number of academics who make a valid choice based on independent criteria and still fall foul of an erroneous list (Option 1), or who think they are safe because a journal is not on the RSDT list even though it is predatory (Option 3).

Comparison

One of my colleagues at Cabells cross-referenced the RSDT list against the Cabells Blacklist, which now contains more than 11,500 journals reviewed and validated as predatory (a minimal sketch of this kind of cross-check follows the list below). The results show that, due to a lack of crossover between the lists, many academics in Algeria, and potentially elsewhere, could be wrongly condemned or unwittingly publish in predatory journals:

  • In terms of publishers, the RSDT list contains 1601 unique publishers, while the Blacklist contains 443
  • There are exactly 200 publishers on both lists, meaning that around 12% of the publishers on the RSDT list are also included in the Blacklist, while around 45% of the Blacklist publishers are also on the RSDT list
  • The RSDT list contains 2488 unique journals, of which only 81 are the same as the 11,500+ Blacklist journals
  • Less than 1% (0.7%) of the Blacklist is also on the RSDT list; conversely, about 3% of the RSDT list is also included on the Blacklist.
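For readers who want to run a similar check themselves, here is a minimal sketch of how two such lists might be cross-referenced. The journal names and list contents below are hypothetical stand-ins rather than real exports, and matching on names alone is a simplification – in practice ISSNs and fuzzy matching are needed to catch near-identical titles.

```python
# Minimal sketch of the kind of cross-check described above. In practice the
# full lists would be loaded from exports of the RSDT list and the Cabells
# Blacklist; the tiny inline sets below are hypothetical stand-ins.

def normalize(name: str) -> str:
    """Lower-case and trim a journal name so simple comparisons work."""
    return " ".join(name.lower().split())

# Hypothetical stand-ins for the two lists (real exports would be far larger).
rsdt_list = {normalize(n) for n in [
    "Journal of Advanced Example Studies",
    "International Review of Placeholder Research",
    "British Journal of Marketing Studies",
]}
blacklist = {normalize(n) for n in [
    "British Journal of Marketing Studies",
    "American Journal of Social Issues and Humanities",
]}

overlap = rsdt_list & blacklist
print(f"Journals on both lists: {len(overlap)}")
print(f"Share of the RSDT list also on the Blacklist: {len(overlap) / len(rsdt_list):.1%}")
print(f"Share of the Blacklist also on the RSDT list: {len(overlap) / len(blacklist):.1%}")
```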

As always, the moral of the story for authors is ‘research your research’ – fully utilize the skills you have gained as an academic by applying them to researching your submission decision and checking multiple sources of information. Failure to do so could mean serious problems, wherever you are.

FTC’s victory will educate, but will it deter?

Word of the Federal Trade Commission’s $50 million court judgment against OMICS International and its owner, Srinubabu Gedela, has reached all corners of the academic community. While there is no question this is a step in the right direction, there is little reason to believe this will do much to slow the growing problem of predatory publishing.

The victory for the FTC was a decisive one, with the court granting a summary judgment – a decision without the need for a trial as no material facts were in dispute – and the message is clear: OMICS International is running a scam operation and the damage is real and impactful. If there are still those in the scholarly community who doubt the severity of the problem of predatory publishing, perhaps this judgment will convince them of the magnitude of the issue and the need to deal with it in a proactive manner.

While OMICS is the largest predatory publishing operation – there are currently 768 OMICS journals listed on Cabells’ Journal Blacklist – it is simply the most ravenous shark in a sea of predators. If/when OMICS actually halts their fraudulent operations – they are expected to appeal the decision – the vacuum created will quickly be filled by any number of bad actors looking to capitalize and snatch up the revenue that is now presumably up for grabs.

Fundamentally, there are two groups of researchers at play when it comes to predatory publishers. The first group is made up of researchers who might be considered “prey” – those who unwittingly fall victim to the ploys of predatory publishers. These researchers submit their paper for publication only to find that their work has been hijacked, part of their limited budget has been squandered on (often hidden) publication fees, and their careers have been compromised by this association with a fraudulent publisher.

The work of the FTC and news of their victory over OMICS will hopefully go far in protecting those who are unaware of the existence and deceitful nature of operations such as OMICS, and will educate them on the warning signs and help them steer clear of ever becoming involved going forward.

However, what (if any) impact this decision will have on the second group of researchers – those who knowingly use predatory publishers to advance their careers or for other professional gains – remains to be seen. The reason predatory publishers have been able to flourish and grow exponentially is that there is an insatiable market for their services, due in large part to the ‘publish or perish’ system forced upon academics. The publication of research papers is at an all-time high, with estimates of close to two million papers published each year and little in the way of quality control. Predatory publishers have simply identified and capitalized on an opportunity for illicit profit.

The focus on research and publication needs to be on quality and not quantity. Publication records need to be vetted and researchers held accountable for the outlets they choose for publication.  The respective bodies of knowledge for many fields are compromised and diluted by the dissemination of junk research. Unqualified candidates are getting hired, promoted and tenured on the backs of their ginned-up publication records. Predatory publishers and these researchers who support them are coming out ahead, while academia and knowledge are losing out.

Despite the decision against OMICS, certain researchers have been and will continue to look for shortcuts to publication. The onus is on administrators, department heads, funders, and academia at large to change – not just the process by which academics are measured by moving away from the “publish or perish” mindset, but also the methods used to monitor and vet research and publication activity.

The FTC’s victory may go a long way toward reducing the number of researchers who can honestly say they were unaware of a problem with the journal they chose, but without wholesale changes from other key stakeholders it will do little to stop those who are willful participants in the process.

Blacklist Journals Overtake Whitelist

What’s in a number? Well, when the number of bad journals overtakes the number of good journals, we may have something to worry about. Simon Linacre takes a brief look behind the figures and shares some insight into the current dynamics of scholarly publishing.


Right up there with ‘How many grains of sand are there in the world?’, ‘Is Santa Claus real?’ and ‘Where do babies come from?’, one of the questions you do not want to be asked as a member of the scholarly communications industry is ‘How many journals are there?’. This is because, like grains of sand, there is no definitive answer: the numbers change from one day to the next, and there is no way even to approximate an educated guess. You could, perhaps, as a fallback, look at the journal counts that someone has actually tallied and keeps updated. For example:

  • Cabells Journal Whitelist: 11,048
  • Clarivate Analytics Master Journal List: 11,727
  • Directory of Open Access Journals: 12,728
  • Scopus: 36,377
  • Ulrich’s Periodicals Directory: 300,000+ (periodicals)

However, all of the above have criteria that either limit the number of journals they count or include most journals plus other forms of publication. And another journal list adds further complexity: the Cabells Journal Blacklist.

The more eagle-eyed among you will have noticed that the Cabells Blacklist now lists more journals than are indexed in the Whitelist. How can this be? Are we saying there are more predatory journals than legitimate titles out there? Well, not quite. While Cabells has a growing Blacklist thanks to the ever-expanding activities of predatory publishers, the Whitelist is limited to journals of evident quality according to specific criteria and is yet to include medical and engineering journals. When both databases were launched in 2017, the Whitelist was based on Cabells Directories going back decades, while the Blacklist was newly developed with 4,000 journals. That figure has grown to over 11,000 in nearly two years, with many more journals in the pipeline for assessment.

Due to the rigorous process Cabells administers for the Whitelist, it was inevitable that a list from which many titles are rejected would be overtaken by the Blacklist, where sadly ever more titles qualify for inclusion due to the proliferation of predatory publishing practices.

So, if you do get asked the dreaded question, the answer is that there are a LOT of journals out there. Some are good, some are bad, and some are in-between. But arm yourself with a trusted index and some relevant criteria, and you won’t need to play the numbers game.