Empowering India’s Academia

According to some research, India has the unfortunate distinction of being home to both the highest number of predatory journals and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted article by Shen and Bjork in 2015, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries in which predatory journals originate.

There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to act and to support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out, as some journals have been cloned or hijacked, while others use the predatory tactic of simply lying about being listed by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authors. Following a webinar last week for an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than was available, so I thought it would be worth sharing some of those questions and my answers here, so that others can hopefully pick up a few tips when making that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? They look and feel much the same, but the Impact Factor (IF) counts citations received in a given year, in other Web of Science-indexed journals, to a journal’s articles published over the previous two years, whereas the CiteScore counts citations received in a given year to a journal’s documents published over the previous three years (see the worked sketch after this list).
  2. How do I know if an Impact Factor is fake? Until recently this was tricky, but Clarivate Analytics has now made the previous year’s IFs for the journals it indexes available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A request can be made to the publisher for the article to be retracted; however, a predatory publisher is very unlikely to accede to the request and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago, which provides country-specific data on journals.
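For anyone who wants to see the arithmetic behind the first answer, here is a simplified sketch based on the definitions above (the exact rules about which document types count are set by Clarivate and Scopus respectively, and both providers revise their methodologies from time to time):

\[
\mathrm{IF}_{Y} = \frac{\text{citations received in year } Y \text{ to articles published in years } Y-1 \text{ and } Y-2}{\text{citable articles published in years } Y-1 \text{ and } Y-2}
\]

\[
\mathrm{CiteScore}_{Y} = \frac{\text{citations received in year } Y \text{ to documents published in years } Y-1,\ Y-2 \text{ and } Y-3}{\text{documents published in years } Y-1,\ Y-2 \text{ and } Y-3}
\]

For example, a journal that published 50 citable articles across 2018 and 2019 and received 150 citations to them in 2020 would have a 2020 IF of 150/50 = 3.0; the CiteScore simply widens the publication window to three years and counts all document types.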

Simon Linacre, Cabells

The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the Summer months are usually filled with holidays, conferences and a less than serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the Summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, and earlier this year they compiled what they claimed to be the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020). Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences in collaboration with Anna A. Abalkina, Alexei S. Kassian, Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals with 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.

In addition, the study saw that over 1,100 Russian authors had put their names to translated articles which were published in predatory journals. These included heads of departments at Russian universities and, in the case of three authors, more than 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project that is developing a database of mainly Russian journals which publish plagiarised articles or otherwise violate publication ethics. They are concerned that the existence of paper mills in Russia that spam authors and offer easy publication in journals is leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend them some legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

Know before you go

Earlier this week, the Guardian in the UK released its updated university rankings, just the latest of myriad national and international exercises in defining the “best” university. At a time when deciding to go to university is fraught with unknowns, Simon Linacre argues that critical thinking skills are more important than ever.


I’ll admit it, I love it when my own university tops any kind of ranking. The fact that I was there 25 years ago, and that the subjects taught and the academics teaching there are now unrecognisable, is of no consequence. That the department I graduated from is the BEST in the country is something I have to tell my wife and colleagues about, despite the arched eyebrows and disdain I can feel from several thousand miles away.

What does this mean? Well, aside from the fact that I must be a walking target for alumni fundraisers, it shows that my critical faculties, if not entirely absent, are certainly being overridden by shameless pride in something I have no right to be proud about. But like your favourite football team, you can’t help pulling for them through thick and thin – when they suck you say “we’re awful!”, and when they do well you say “we’re top!”.

Who’s “we”?

However, deciding which university to go to in a year or two is not the same as choosing a football team to support. You should use every critical faculty you have and hone it until it is razor-sharp before you even think of filling in a form or visiting a campus. And that means you should learn to read a university ranking like you would read a balance sheet before investing in a company, or reviewing a journal before submitting an article. I do not believe there is anything inherently wrong in any ranking as it can provide extremely useful data on which to base a decision. But you need to know where the data came from and how it relates to the investment you are making in your future life and career. This is why we always recommend users of Cabells’ Journalytics database use other relevant data points for their individual circumstances.

This week, the Guardian published its UK university rankings for 2021, with Oxford, St Andrews and Cambridge leading the way overall (full disclosure: I attended St Andrews). Each broad subject is then broken down into separate rankings with the promise that, “unlike other league tables, the Guardian rankings are designed with students in mind.” What, other university rankings do NOT have students in mind? Straight away, the amount of spin adopted here should tell you that (a) you should be wary of other hyperbole, and (b) you should look at other league tables to see why the Guardian would say this.

And there are plenty of tables to choose from – both nationally and internationally there are dozens of such rankings, all seeking to advise students on which university to choose. Why? Because, of the 48 pages of the university guide, six are adverts. Organisations publish rankings guides to sell advertising and cement their reputation as education experts, further enhancing their opportunities to sell education-related advertising in the future. Knowing why there is spin and why these guides exist in the first place should help students understand what information is in front of them and ensure a better decision-making process.

But students should also dig deep into the data. In the “Business, management and marketing” subject ranking, readers are told that, “most universities will boast of having good links with business,” that “group work is a key part of many courses” and “there will also be a practical element to assessment.” But none of these points are addressed in the rankings, which include data on criteria such as course and teaching satisfaction, spend per student and “career after 15 months.” All of this information is relevant but only some has data to back it up.

Sitting with my 12-year-old at breakfast, he looked at the page on architecture (which he has wanted to do since the age of about seven), and decided he should go to Cambridge, UCL or Bath as the top three for that subject. None of those would be a bad choice, but neither would they be an informed one.