Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination’”, now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals, with their invented editorial boards, missing peer review and other bad publication practices, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

The A-Z’s of predatory publishing

Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, posting one tweet a week until it had worked through the entire alphabet. In this week’s blog, Simon Linacre republishes all 26 tweets in one place as a primer on how to deal successfully with the phenomenon.

A = American. US probable source of #PredatoryJournals activity as it lends credence to claims of legitimacy a journal may adopt to hoodwink authors into submitting articles #PredatoryAlphabet #PredatoryJournal #PublicationEthics

B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals

C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.

D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal

E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature

F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior

G = Germany, which takes #PredatoryJournals seriously through university-level checks, highlighting the issue and exposing problems in a 2018 investigation

H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity

I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)

J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivate Master Journal List

K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be as important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases

L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database

M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases

N = Nature. Using a trusted scholarly brand such as @nature can help identify, understand and define #PredatoryJournals, with dozens of articles on the subject via @NatureNews

O = OMICS. In 2019, the FTC fined publisher OMICS over $50m for deceptive publishing practices

P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals

Q = Quick publication. Peer review is a long process, typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks

R = Research. Academic #Research should also include research on #Journals to ensure #PublicationEthics and #ResearchIntegrity are followed. Use @CabellsPublish Predatory Reports to support your research

S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database

T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices

U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (e.g., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)

X = There are no current ISSNs with more than one ‘X’ at the end, which serves as a check digit. Beware fake ISSNs with multiple X’s (see the check-digit sketch after the Z entry below)

Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article

Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article
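A brief technical aside on the ‘X’ entry above: the final character of an ISSN is a check digit calculated from the first seven digits, which is why a genuine ISSN can contain at most one ‘X’, and only in the last position. The sketch below (in Python, our illustration rather than anything from the original tweets) shows the standard ISO 3297 calculation:

```python
# A minimal sketch (our illustration, not part of the original tweets) of the
# ISO 3297 ISSN check-digit rule: the first seven digits are weighted 8 down
# to 2, summed, and the eighth character must bring the total to a multiple
# of 11. A remainder of 10 is written as 'X', which is why a genuine ISSN can
# contain at most one 'X', and only in the final position.

def issn_check_digit(first_seven: str) -> str:
    """Return the check character for the first seven digits of an ISSN."""
    total = sum(int(d) * w for d, w in zip(first_seven, range(8, 1, -1)))
    remainder = (11 - total % 11) % 11
    return "X" if remainder == 10 else str(remainder)

def is_valid_issn(issn: str) -> bool:
    """Check that a string such as '2049-3630' is a well-formed ISSN."""
    s = issn.replace("-", "").upper()
    if len(s) != 8 or not s[:7].isdigit() or not (s[7].isdigit() or s[7] == "X"):
        return False
    return issn_check_digit(s[:7]) == s[7]

print(is_valid_issn("2049-3630"))  # True  - the check digit works out to 0
print(is_valid_issn("2434-561X"))  # True  - checksum-valid trailing X (constructed example)
print(is_valid_issn("2434-56XX"))  # False - 'X' can never appear before the final position
```

An ISSN that fails this check, or that shows an ‘X’ anywhere other than the final position, cannot be genuine. Note that passing the check only proves the number is well formed, not that it was actually assigned to the journal claiming it, which can be verified at portal.issn.org.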

Open with purpose

This week is Open Access Week, which you will not have missed due to the slew of Twitter activity, press releases and thought pieces being published – unless you are an author, perhaps. In this week’s blog, Simon Linacre focuses on academic researchers, who are often overlooked in the OA conversation despite the fact that they should be the focus of the discussion.

The other day, I was talking to my 16-year-old son about university, as he has started to think about what he might study and where he might like to go (“Dunno” and “dunno” are currently his favourite subjects and destinations). To spark some interest in the idea of higher education, I told him how great the freedom of being away was, the relaxed lifestyle, and the need to be responsible for your own actions, such as handing in your work on time, even if you had to pull an all-nighter.

“What do you mean ‘hand in your work’?”, he said.

“You know, put my essay in my tutor’s pigeon hole”, I said.

“Why didn’t you just email it? And what do pigeons have to do with it?”, he replied.

Yes, university in the early 90s was a very different beast from today’s, and I decided to leave pigeons out of the ensuing discussion, but it highlighted to me that while a non-digital university experience is now just a crusty anecdote for students in education today, the transition from the 80s and 90s to the present state of affairs is the norm for those teaching in today’s universities. In addition, many of the activities and habits that established themselves 20 to 30 years ago and beyond still exist, albeit adapted to new technology.

One of these activities that has changed, yet stayed the same, is of course academic publishing. In the eyes of many people, publishing now differs incredibly from what it was in the pre-internet 80s – physical vs digital, delayed vs instant, subscription vs open. But while remnants of the older forms of publishing remain in the shape of page numbers or journal issues, there are still shadows from the introduction of open access in the early 2000s. This was brought home to me in some recent webinars in Turkey, Ukraine and India (reported here), where the one common question about predatory journals was: “Are all open access journals predatory?”

To those of us who have worked in publishing, or to Western academics, this may seem a naïve question. But it is not. Open Access articles – that is, articles that are both free to read on the internet and free to re-use – are still relatively unknown to many academics around the world. In addition, being asked to pay money to publish is still not the norm – most journals listed by the Directory of Open Access Journals do not charge an Article Processing Charge (APC) – and publisher marketing communications are dominated by spam emails from predatory journals rather than press releases during Open Access Week. As such, while the dial may have moved appreciably in Europe and North America following initiatives such as Plan S and high-profile standoffs such as that between the University of California and Elsevier, the discussion about OA may not have been replicated elsewhere.

So, while there will be many interesting conversations about Open Access this week (and Delta Think has some fascinating data here), it is also important not to forget that many authors may be hearing about it for the first time, or previously may have heard only negative or misleading information. Thankfully, there are plenty of useful resources out there, such as this introduction from Charlesworth Author Services to help authors identify the right OA outlet for their research. And of course, authors should remember that most Open Access journals are not predatory – but to be on the safe side, they can check our Predatory Reports database or use its criteria to judge for themselves.

Empowering India’s Academia

According to some research, India has the unfortunate distinction of being home to both the highest number of predatory journals and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted article by Shen and Bjork in 2015, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries in which predatory journals originate.

There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to try to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent work of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out, as some journals have been cloned or hijacked, while others use the predatory journal tactic of simply lying about their listing by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authors. Following one webinar last week to an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than was allowed, so I thought it would be worth sharing some of those questions and my answers here so that others can hopefully pick up a few tips when making that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? Well, they both look and feel the same, but the Impact Factor (IF) records a journal’s published articles over a two-year period and how they have been cited in the following year in other Web of Science-indexed journals, whereas the CiteScore records a journal’s published documents over a three-year period before counting citations in the following year (both calculations are sketched below this list).
  2. How do I know if an Impact Factor is fake? Until recently this was tricky, but Clarivate Analytics has now made the previous year’s IFs for the journals it indexes available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A submission can be made to the publisher for an article to be retracted; however, a predatory publisher is very unlikely to accede to the request and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago, which provides country-specific data on journals.
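To make the distinction in the first answer above concrete, the two calculations can be sketched roughly as follows. This is a simplified rendering of the definitions as described in that answer; Clarivate and Scopus publish the precise, current methodologies (and the CiteScore window has since been revised), so treat this as an illustration rather than the official formulas.

```latex
\mathrm{IF}_{Y} = \frac{\text{citations in year } Y \text{ (from Web of Science-indexed journals) to articles published in } Y-1 \text{ and } Y-2}{\text{citable articles published in } Y-1 \text{ and } Y-2}

\mathrm{CiteScore}_{Y} = \frac{\text{citations in year } Y \text{ (in Scopus) to documents published in } Y-1,\ Y-2 \text{ and } Y-3}{\text{documents published in } Y-1,\ Y-2 \text{ and } Y-3}
```

The practical point for authors is that the two metrics use different citation windows, different source databases and different document counts, so they are not interchangeable – and a journal website quoting one as if it were the other is a warning sign.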

Simon Linacre, Cabells

The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the Summer months are usually filled with holidays, conferences and a less than serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the Summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, and earlier this year they compiled what they claimed to be the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020), Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, in collaboration with Anna A. Abalkina, Alexei S. Kassian, Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals with 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.

In addition, the study saw that over 1,100 Russian authors had put their names to translated articles which were published in predatory journals. These included heads of departments at Russian universities and, in the case of three authors, more than 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project which is developing a database of mainly Russian journals that publish plagiarised articles or violate other criteria of publication ethics. They are concerned that the existence of paper mills in Russia that spam authors and offer easy publication in journals is leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend those journals some legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

How do you know you can trust a journal?

As many readers know, this week is Peer Review Week, the annual opportunity for those involved in scholarly communication and research to celebrate and learn about all aspects of peer review. As part of this conversation, Simon Linacre reflects on this year’s theme of ‘Trust in Peer Review’ in terms of the important role of peer review in the validation of scholarship, and the dangers of predatory behaviour in its absence.


I was asked to deliver a webinar recently to a community of scholars in Eastern Europe and, as always with webinars, I was very worried about the Q&A section at the end. When you deliver a talk in person, you can tell by looking at the crowd what is likely to happen at the end of the presentation and can prepare yourself. A quiet group of people means you may have to ask yourself some pretty tough questions, as no one will put their hand up at the end to ask you anything; a rowdy crowd is likely to throw anything and everything at you. With a webinar, there are no cues, and as such, it can be particularly nerve-shredding.

With the webinar in question, I waited a while for a question and was starting to prepare my quiet crowd response, when a single question popped up in the chat box:

How do you know you can trust a journal?

As with all the best questions, this floored me for a while. How do you know? The usual things flashed across my mind: reputation, whether it has published known scholars in its field, whether it is indexed by Cabells or other databases, etc. But suddenly the word trust felt a lot more personal than a simple tick-box exercise to confirm a journal’s standing. That may confirm a journal is trustworthy, but is that the same as the feeling an individual has when they really trust something or someone?

The issue of trust is often the unsaid part of the global debates that are raging currently, whether it is responses to the coronavirus epidemic, climate change or democracy. Politicians, as always, want the people to trust them; but increasingly their actions seem to be making that trust harder and harder to give. As I write, the UK has just put its two top scientists in front of the cameras to give a grave warning about COVID-19 and a second wave of cases. The fact that there was no senior politician to join them was highly symbolic.

It is against this background that Trust in Peer Review is an appropriate theme for Peer Review Week (full disclosure: I have recently joined one of the PRW committees to support the initiative). There is a huge groundswell of support from publishers, editors and academics for both the effectiveness of peer review and the unsung heroes who do the job for little recognition or reward; the absence of either would have profound implications for research and society as a whole.

Which brings me to the answer to the question posed above, which is to ask the opposite: how do you know when you cannot trust a journal? This is easier to answer, because you can point to the absence of all those characteristics and behaviours that you would want in a journal. We see on a daily basis with our work on Predatory Reports how the absence of crucial aspects of a journal’s workings can cause huge problems for authors: no listed editor, a fake editorial board, a borrowed ISSN, a hijacked journal identity, a made-up impact factor, and – above all – false promises of a robust peer review process. Trust in peer review may require some research on the part of the author in terms of checking the background of the journal, its publisher and its editors, and it may require you to contact the editor, editorial board members or published authors to get personal advice on publishing in that journal. But doing that work in the first place and receiving personal recommendations will build trust in peer review for any authors who have doubts – and collectively for all members of the academic community.

Special report: Assessing journal quality and legitimacy

Earlier this year Cabells engaged CIBER Research (http://ciber-research.eu/) to support its product and marketing development work. Today, in collaboration with CIBER, Simon Linacre looks at the findings and implications for scholarly communications globally.


In recent months the UK-based publishing research body CIBER has been working with Cabells to better understand the academic publishing environment, both specifically in terms of medical research publications and more broadly with regard to the continuing problems posed by predatory journals. While the research was commissioned privately by Cabells, it was always with the understanding that many of the findings could be shared openly to enable a better understanding of these two key areas.

The report — Assessing Journal Quality and Legitimacy: An Investigation into the Experience and Views of Researchers and Intermediaries – with special reference to the Health Sector and Predatory Publishing — has been shared today on CIBER’s website, and the following briefly summarizes the key findings of six months’ worth of research:

  • The team at CIBER Research was asked to investigate how researchers in the health domain went about selecting journals to publish their papers, what tools they used to help them, and what their perceptions of new scholarly communications trends were, especially in regard to predatory journals. Through a mixture of questionnaire surveys and qualitative interviews with over 500 researchers and ‘intermediaries’ (i.e. librarians and research managers), the research pointed to a high degree of self-sufficiency among researchers regarding journal selection
  • While researchers tended to use tools such as information databases to aid their decision-making, intermediaries focused on sharing their own experiences and providing education and training solutions to researchers. Overall, it was notable how much of a mismatch there was between what researchers said and what intermediaries did or believed
  • So-called ‘whitelists’ were common at national and institutional levels, as was the emergence of ‘greylists’ of journals to be wary of; however, there seemed to be no list of recommended journals specific to medical research areas
  • In China, alongside its huge growth in research and publication output, there are concerns that predatory publishing could have an impact, with one participant stating that, “More attention is being paid to the potential for predatory publishing and this includes the emergence of Blacklists and Whitelists, which are government-sponsored. However, there is not just one, there are many – 10 or 20 or 50 different (white)lists in place”
  • In India, the explosion of predatory publishing is perhaps the consequence of educational and research expansion and the absence of the infrastructure capacity to deal with it. An additional factor could be a lack of significant impetus at a local level to establish new journals, unlike in countries such as Brazil; indeed, universities are not legally able to establish new titles themselves. As a result, an immature market has attempted to develop new journals to satisfy scholars’ needs, which in turn has led to the rise of predatory publishing in the country
  • Predatory publishing practices seemed to be having an increased impact on mainstream publishing activities globally, with a grave risk of “potentially polluting repositories and citation indexes but there seems to have been little follow through by anyone.” National bodies, publishers and funders have failed to follow up on the threat, or on how it may have diverted funds away from legitimate publications to those engaged in illicit activities
  • Overall, predatory publishing is being driven by publish-or-perish scenarios, particularly with early career researchers (ECRs) where authors are unaware of predatory publishers in general, or of the identity of a specific journal. However, a cynical manipulation of such journals as outlets for publications is also suspected.

 

‘Why do you think researchers publish in predatory journals?’

 


CIBER Research is an independent group of senior academic researchers from around the world, who specialize in scholarly communications and publish widely on the topic. Their most recent projects have included studies of early career researchers, digital libraries, academic reputation and trustworthiness.

 

They’re not doctors, but they play them on TV

Recently, while conducting investigations of suspected predatory journals, our team came across a lively candidate. At first, as is often the case, the journal in question seemed to look the part of a legitimate publication. However, after taking a closer look and reading through one of the journal’s articles (“Structural and functional brain differences in key opinion journal leaders“) it became clear that all was not as it seemed.

Neurology and Neurological Sciences: Open Access, from MedDocs Publishers, avoids a few of the more obvious red flags that indicate deceitful practices, even to neophyte researchers, but lurking just below the surface are several clear behavioral indicators common to predatory publications.


With a submission date of August 22, 2018, and a publication date of November 13, 2018, the timeline suggests that some sort of peer review of this article may have been carried out. A closer examination of the content makes it evident that little to no peer review actually took place. The first tip-off was the double-take-inducing line in the “Material and methods” section, “To avoid gender bias, we recruited only males.” Wait, what? That’s not how that works.

It soon became clear to our team that even a rudimentary peer review process (or perhaps two minutes on Google) would have led to this article’s immediate rejection. While predatory journals are no laughing matter, especially when it comes to medical research in the time of a worldwide pandemic, it is hard not to get a chuckle from some of the “easter eggs” found within articles intended to expose predatory journals. Some of our favorites from this article:

  • Frasier Crane, a listed author, is the name of the psychiatrist from the popular sitcoms Cheers and Frasier
  • Another author, Alfred Bellow, is the name of the NASA psychiatrist from the TV show I Dream of Jeannie
  • Marvin Monroe is the counselor from The Simpsons
  • Katrina Cornwell is a therapist turned Starfleet officer on Star Trek: Discovery
  • Faber University is the name of the school in Animal House (Faber College in the film)
  • Orbison University, which also doesn’t exist, is likely a tribute to the late, great musician Roy Orbison

And, perhaps our favorite find and one we almost missed:

  • In the “Acknowledgments” section the authors thank “Prof Joseph Davola for his advice and assistance.” This is quite likely an homage to the Seinfeld character “Crazy Joe Davola.”

Though our team had a few laughs with this investigation, they were not long-lived, as this is yet another illustration of the thousands of journals like this one in operation (Predatory Reports currently lists well over 13,000 titles): outlets that publish almost (or literally) anything, usually for a fee, with no peer review or other oversight in place and with no consideration of the detrimental effect they may have on science and research.

Predatory Reports listing for Neurology and Neurological Sciences: Open Access

A more nuanced issue that deceptive publications create involves citations. If this were legitimate research, the included citations would not ‘count’ or be picked up anywhere, since this journal is not indexed in any citation databases. Furthermore, any citation in a predatory journal that cites a legitimate journal is ‘wasted’, as the legitimate journal cannot count or use that citation appropriately as a foundation for its legitimacy. However, these citations could be counted via Google Scholar, although (thankfully) this journal has zero. Citation ‘leakage’ can also occur, where a legitimate journal’s articles cite predatory journals, effectively ‘leaking’ those citations out of the legitimate scholarly publishing sphere into illegitimate areas. These practices can have the effect of skewing citation metrics, which are measures often relied upon (sometimes exclusively, often too heavily) to gauge the legitimacy and impact of academic journals.

When all is said and done, as this “study” concludes, “the importance of carefully selecting journals when considering the submission of manuscripts,” cannot be overstated. While there is some debate around the use of “sting” articles such as this one to expose predatory publications, not having them exposed at all is far more dangerous.

Right path, wrong journey

In his latest post, Simon Linacre reviews the book The Business of Scholarly Publishing: Managing in Turbulent Times by Albert N. Greco, Professor of Marketing at Fordham University’s Gabelli School of Business, recently published by Oxford University Press.


Given the current backdrop for all industries, one might say that scholarly communications is in more turmoil than most. With the threat to the commercial model of subscriptions posed by increasing use of Open Access options by authors, as well as the depressed book market and recent closures of university presses, the last thing anyone needs in this particular industry is the increased uncertainty brought about by the coronavirus epidemic.

As such, a book looking back at where the scholarly communications industry has come from and an appraisal of where it is now and how it should pivot to remain relevant in the future would seem like a worthwhile enterprise. Just such a book, The Business of Scholarly Publishing: Managing in Turbulent Times, has recently been written by Albert N. Greco, a U.S. professor of marketing who aims to “turn a critical eye to the product, price, placement, promotion, and costs of scholarly books and journals with a primary emphasis on the trajectory over the last ten years.”

However, in addition to this critical eye, the book needs a more practical look at how the industry has been shaken up in the last 25 years or so. It is difficult to imagine that either an experienced academic librarian or an industry professional advised on the direction of the book, as it has a real blind spot when it comes to some of the major issues impacting the industry today.

The first of these historical misses is a failure to mention Robert Maxwell and his acquisition of Pergamon Press in the early 1950s. Over the next two decades the books and journals publisher saw huge increases in revenues and volumes of titles, establishing a business model of rapid growth using high year-on-year price increases for must-have titles that many argue persists to this day.

The second blind spot is around Open Access (OA). The subject is covered, although not in the detail one would like given its importance to the journal publishing industry in 2020. While one cannot blame the author for missing the evolving story around Plan S, Big Deal cancellations and other OA-related stories, one might expect more background on exactly how OA started life, what the first OA journals were, the variety of declarations around the turn of the Millennium, and how technology enabled OA to become the dominant paradigm in some subject areas.

This misstep may be due to the overall slight bias towards books in the text, and indeed the emerging issues around OA books are well covered. There are also extremely comprehensive deep dives into publishing finances and trends since 2000, which mean that the book does provide a worthy companion to any academic study of publishing from 2000 to 2016.

And this brings us to the third missing element, which is the lack of appreciation of new entrants and new forms in scholarly publishing. For example, there is no mention of F1000 and post-publication peer review, little on the establishment of preprint servers or institutional repositories, and nothing on OA-only publishers such as Frontiers and Hindawi.

As a result, the book is simply a (very) academic study of some publishing trends in the 2000s and 2010s, and like much academic research is both redundant and irrelevant for those practicing in the industry. This is typified in a promising final chapter that seeks to offer “new business strategies in scholarly publishing” by suggesting that short scholarly books, and data and library publishing programs should be examined, without acknowledging that all of these already exist.


The Business of Scholarly Publishing: Managing in Turbulent Times, by Albert N. Greco (published April 28, 2020, OUP USA), ISBN: 978-0190626235.

The scientific predator has evolved – here’s how you can fight back

Today’s post was written by Simon Linacre, Director of International Marketing and Development at Cabells, and Irfan Syed, Senior Writer and Editor at Editage Insights.


How do you identify a predatory journal? Easy, check your spam folder – say seasoned researchers.

Actually, this ‘initial indicator’ is often the key to identifying a predatory journal. Predatory publishers send researchers frequent emails soliciting manuscripts and promising acceptance – messages that, thanks to the email service provider’s filters, usually go straight to junk mail. Some cleverly disguised ones do make it to the inbox, though, and sometimes unwary researchers click on one of these mails, unleashing the predator and an all-too-familiar sequence of events: Researcher sends manuscript. Receives quick acceptance, often without any peer review. Signs off copyright. Receives a staggeringly large invoice. Is unable to pay. Asks to withdraw. Receives an equally heavy withdrawal invoice – and threats. The cycle continues, the publisher getting incrementally coercive, the researcher increasingly frustrated.

What makes a predator

The term predatory journal was coined by Jeffrey Beall, former Scholarly Initiatives Librarian at the University of Colorado Denver, in 2010, when he launched his eponymous list (now archived) of fake scientific journals, with the aim of educating the scientific community. The term was meant to mirror the guile of carnivores in the wild: targeting the weak, launching a surprise ambush, and effecting a merciless finish.

A more academic definition might be: “Entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.” In other words, journals that put commerce before science.

Dubious scientific journals have existed since the 1980s. They were born to offer an easy way around the arduous road to acceptance laid down by top-rung journals. More recently, they have received a boost from the rise of the open access (OA) movement, which seeks to shift the balance of power towards the researcher. However, with revenues now accruing from the author side, new researchers pressured by a ‘publish or perish’ culture have proved easy targets for predatory publishers that exploit the OA publishing model.

The new face of predation

Today, academia faces another threat, a new predator in scientific communications – predatory author services. The dangers of using predatory author services can be just as acute as those of predatory journals. Authors who pay for such services are risking the abuse of any funding they have received by in turn funding potentially criminal activity. Such predatory author services may not be equipped to make quality edits to an author’s paper – incorrect edits, changes in the author’s intended meaning, and unidentified errors may adversely affect the author’s manuscript. Many authors choose such services to improve their articles and increase their chances of acceptance in high-quality journals, but they are very likely to be disappointed in light of the quality of services they receive.

So, the issue of predatory author services is just as problematic as that of predatory journals. Despite the efforts of industry bodies such as COPE, it seems there are new players entering the market and even advertising on social media platforms such as Facebook. More worryingly, examples of these predatory services seem to include a veneer of sophistication not seen before, including well-designed websites, live online chat features, and direct calling.

Spotting a predatory author service

The good news is that these services bear many of the traits of predatory journals, and can be identified with a little background research. Here are some tips on how to separate predatory author services from professional operations such as Cactus’ Editage:

  • Check the English: For a legitimate journal to have spelling or grammar errors on its site or in its published articles would be a heinous crime, but this should go double for an author services provider. So, beyond the slick graphics and smiling model faces, check that everything is as it should be with a thorough check of the English
  • Click the links: Dead links, links that loop back to the homepage, or links that don’t match the text should further raise your suspicion
  • Research the partnerships: If a provider genuinely works with Web of Science, Scopus, and The Lancet, there should be evidence of that rather than mere logos copied and pasted onto the homepage. Search online for these publicized partnerships to find out if they are genuine
  • Look up the provenance: Many predatory operators leave no address at all. Some, though, will choose to include a fake address (which turns out to be a long-abandoned dry-cleaning store on a deserted high street, or a legitimate address that’s also home to 1,847 other registered companies). A quick search on Google Maps will show whether the address checks out
  • Run if you spot a ghost: The surest giveaway of a predatory author service is the offering of ghostwriting as a service. Ghost authorship, the act of someone else authoring your entire manuscript, is a violation of research integrity. And when even ghostwriting doesn’t suffice, these services are happy to plagiarize another author’s work and pass it off as the client’s own
  • Ask your peers: Before deciding to use a service, double-check any testimonials on the provider’s homepage or ask around in your peer network.

Taking on the predator – collectively and individually

The scientific predator will no doubt continue to evolve, getting more sophisticated with time. Ultimately, all anyone can do to eradicate predatory author services or journals is to increase awareness among authors and provide resources to help them identify such predators. Cabells, Cactus, and many other industry players continually work to provide this guidance, but a good deal of the burden of responsibility has to be shared by academic researchers themselves. As the Romans might have said, caveat scriptor – author beware!

For any author service ad or email you come across, look it up. Search on the net, ask your fellow researchers, pose a query in a researcher forum, go through recommended indices of quality and predatory publications such as those of Cabells. If it’s genuine, it will show up in several searches – and you will live to publish another day.

For further help and support in choosing the right journal or author services, go to: www.cabells.com or www.editage.com.