Empowering India’s Academia

According to some research, India has the unfortunate distinction of being home to both the highest number of predatory journals and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted article by Shen and Bjork in 2015, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries of origin for predatory journals.

There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to act and support Indian authors who do not want to fall into the numerous traps laid for them. One response has been the recent work of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out: some journals have been cloned or hijacked, while others use the predatory-journal tactic of simply lying about being listed by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authors. Following a webinar last week at an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than the session allowed, so I thought it would be worth sharing some of those questions and my answers here, in the hope that others can pick up a few tips for that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? They look and feel similar, but the Impact Factor (IF) counts citations received in Web of Science-indexed journals in a given year to a journal’s articles from the previous two years, whereas the CiteScore counts citations in Scopus-indexed sources to a journal’s published documents from the previous three years (both are sketched as formulas after this list).
  2. How do I know if an Impact Factor is fake? Until recently this was tricky, but Clarivate Analytics has now made the previous year’s IFs for the journals it indexes available on its Master Journal List, so a claimed IF can be checked against the official figure.
  3. If I publish an article in a predatory journal, can the paper be removed? A request can be made to the publisher for the article to be retracted; however, a predatory publisher is very unlikely to accede and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago, which provides country-specific data on journals.
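To make the answer to question 1 concrete, both metrics can be written as simple ratios. This is a sketch of the headline definitions only – the IF draws its citations from Web of Science, the CiteScore from Scopus – and it omits the indexers’ finer rules about which document types count:

$$\mathrm{IF}_y = \frac{\text{citations received in year } y \text{ to articles published in years } y{-}1 \text{ and } y{-}2}{\text{citable items published in years } y{-}1 \text{ and } y{-}2}$$

$$\mathrm{CiteScore}_y = \frac{\text{citations received in year } y \text{ to documents published in years } y{-}1 \text{ to } y{-}3}{\text{documents published in years } y{-}1 \text{ to } y{-}3}$$

The arithmetic has the same shape; the differences lie in the citation index used, the length of the publication window, and which document types are counted in the denominator.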

Simon Linacre, Cabells

Know before you go

Earlier this week, the Guardian in the UK released its updated university rankings, just the latest of myriad national and international exercises in defining the “best” university. At a time when deciding to go to university is fraught with unknowns, Simon Linacre argues that critical thinking skills are more important than ever.


I’ll admit it, I love it when my own university tops any kind of ranking. The fact that I was there 25 years ago, and that the subjects taught and the academics teaching them are now unrecognisable, is of no consequence. That the department I graduated from is the BEST in the country is something I have to tell my wife and colleagues about, despite the arched eyebrows and disdain I can feel from several thousand miles away.

What does this mean? Well, aside from the fact that I must be a walking target for alumni fundraisers, it shows that my critical faculties, if not entirely absent, are certainly being overridden by shameless pride in something I have no right to be proud about. But like your favourite football team, you can’t help pulling for them through thick and thin – when they suck you say “we’re awful!”, and when they do well you say “we’re top!”.

Who’s “we”?

However, deciding which university to go to in a year or two is not the same as choosing a football team to support. You should use every critical faculty you have and hone it until it is razor-sharp before you even think of filling in a form or visiting a campus. That means you should learn to read a university ranking the way you would read a balance sheet before investing in a company, or review a journal before submitting an article. I do not believe there is anything inherently wrong with any ranking, as each can provide extremely useful data on which to base a decision. But you need to know where the data came from and how it relates to the investment you are making in your future life and career. This is why we always recommend that users of Cabells’ Journalytics database also consult other data points relevant to their individual circumstances.

This week, the Guardian published its UK university rankings for 2021, with Oxford, St Andrews and Cambridge leading the way overall (full disclosure: I attended St Andrews). Each broad subject is then broken down into separate rankings with the promise that, “unlike other league tables, the Guardian rankings are designed with students in mind.” What, other university rankings do NOT have students in mind? Straight away, the amount of spin adopted here should tell you that (a) you should be wary of hyperbole elsewhere in the guide, and (b) you should look at other league tables to see why the Guardian would say this.

And there are plenty of tables to choose from – both nationally and internationally there are dozens of such rankings, all seeking to advise students on which university to choose. Why? Because six of the university guide’s 48 pages are adverts. Organisations publish rankings guides to sell advertising and cement their reputation as education experts, further enhancing their opportunities to sell education-related advertising in the future. Knowing why there is spin and why these guides exist in the first place should help students understand what information is in front of them and ensure a better decision-making process.

But students should also dig deep into the data. In the “Business, management and marketing” subject ranking, readers are told that, “most universities will boast of having good links with business,” that “group work is a key part of many courses” and “there will also be a practical element to assessment.” But none of these points are addressed in the rankings, which include data on criteria such as course and teaching satisfaction, spend per student and “career after 15 months.” All of this information is relevant but only some has data to back it up.

Sitting with my 12-year-old at breakfast, he looked at the page on architecture (which he has wanted to do since the age of about seven), and decided he should go to Cambridge, UCL or Bath as the top three for that subject. None of those would be a bad choice, but neither would they be an informed one.

A New Year’s resolution worth keeping: Say “NO” to spam

Recent studies have shown the huge impact that spam emails from predatory journals have on academics’ workflows. Simon Linacre argues that, far from being harmless, they contribute to a wider malaise in academic life.


If I said I had a New Year’s resolution that could save everyone who reads this blog hundreds of dollars in time and effort, as well as enrich everyone’s lives, would you be interested in joining me? There is no catch, no trick, but there is a small degree of effort involved. And it is quite simple – just open up every unsolicited email you receive and either block the sender or unsubscribe. Your life will improve as a result, guaranteed.
 
But will such a straightforward, if humdrum, task really make such savings? Well, two recent studies show that the total cost to academia of spam emails is vast. Firstly, this week’s Times Higher Education (THE) reports on a new study that puts the value of the time wasted opening and deleting spam emails, typically ones from predatory journals, at $1.1bn – and this rises to over $2bn when all spam email is included.
 
They arrive at this figure using the following methodology: take an average figure for the number of targeted spam emails academics receive each day from a number of prior studies (around five); estimate that each academic spends five seconds dealing with every email; assume the average academic earns $50 an hour; and multiply by the number of academics in the world according to the United Nations. It may sound like a back-of-a-napkin calculation – a rough reconstruction is sketched below – but for many academics, both the number of emails and the time needed to sift through them may seem significantly undercooked.
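As a sanity check, here is that napkin arithmetic written out in a few lines of Python. The per-email and wage figures come straight from the methodology above; the global headcount is my own assumption (roughly 8.4 million researchers, a commonly cited UNESCO-style estimate), since the post does not quote the exact UN figure the study used:

```python
# Rough reconstruction of the THE spam-cost estimate.
# The headcount below is an assumption (a commonly cited UNESCO-style
# figure), not a number taken from the study itself.

EMAILS_PER_DAY = 5                # targeted spam emails per academic per day
SECONDS_PER_EMAIL = 5             # time spent opening/deleting each one
HOURLY_RATE_USD = 50              # assumed average academic wage
ACADEMICS_WORLDWIDE = 8_400_000   # assumed global researcher headcount
DAYS_PER_YEAR = 365

seconds_per_year = EMAILS_PER_DAY * SECONDS_PER_EMAIL * DAYS_PER_YEAR
cost_per_academic = seconds_per_year / 3600 * HOURLY_RATE_USD
global_cost = cost_per_academic * ACADEMICS_WORLDWIDE

print(f"Per academic: ${cost_per_academic:,.0f} per year")   # ~$127
print(f"Globally: ${global_cost / 1e9:.2f}bn per year")      # ~$1.06bn
```

With those inputs the total lands within rounding distance of the $1.1bn headline, which suggests the study’s assumptions are roughly as described here.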
 
Another study published in the British Medical Journal (BMJ) looked more specifically at the impact of emails received from predatory journal publishers by career development grant awardees. This study found that academic spam emails (or ASEs) were a significant distraction for academics and that there was an urgent need to mitigate the problem. The results from a survey of grant awardees showed that almost 90% had a spam filter turned on, but around half said they received up to 10 spam emails a day, with fully 30% estimating they received between 11 and 20. 
 
Some unsolicited emails may of course be legitimate, and can simply be blocked, while others are the result of individuals at some stage signing up to receive them, usually to gain access to something or when making a purchase. Emails from law-abiding sources like these can be stopped – it just takes a little time. The same goes for the permanent slew of special offers that follows a purchase from Amazon, Gap or eBay made with a work email address (don’t worry, we’ve all been there). In these cases, our New Year’s resolution can indeed cut down the time spent on email and make more time for more meaningful pursuits.
 
However, as the THE piece points out, there is currently very little academics can do to stem the tide of spam from predatory journals. All we can do is become more savvy at identifying such emails quickly, choosing not to open them and deleting them straight away – and, in the meantime, hope that someone invents a spam filter that genuinely screens ASEs out and doesn’t send important emails from your Dean to your ‘junk’ folder.
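For what it’s worth, the filter being wished for here is harder than it sounds, and a toy sketch shows why. The phrase list and threshold below are invented for illustration, not drawn from any real product; the point is that naive keyword rules both catch obvious ASEs and come perilously close to junking the Dean’s email:

```python
# Toy illustration only: a naive keyword heuristic for flagging
# academic spam emails (ASEs). The phrases and threshold are invented
# for this sketch, not taken from any real spam filter.

SUSPECT_PHRASES = [
    "esteemed professor",        # generic flattery common in ASEs
    "impact factor",             # frequently used as bait
    "submit your manuscript",
    "fast publication",
    "nominal processing fee",
]

def ase_score(text: str) -> int:
    """Count how many suspect phrases appear in an email body."""
    lowered = text.lower()
    return sum(phrase in lowered for phrase in SUSPECT_PHRASES)

def looks_like_ase(text: str, threshold: int = 2) -> bool:
    return ase_score(text) >= threshold

spam = ("Dear esteemed professor, submit your manuscript to our "
        "journal with high impact factor and fast publication!")
legit = ("Hi, the Dean asked whether you could submit your manuscript "
         "revisions before Friday's meeting.")

print(looks_like_ase(spam))    # True  (four phrases match)
print(looks_like_ase(legit))   # False (only one phrase matches)
```

Lower the threshold to one and the Dean’s perfectly legitimate email gets junked too – exactly the trade-off that makes a filter that genuinely screens ASEs out so elusive.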