On the day that the US says goodbye to its controversial President, we cannot yet bid farewell to one of his lasting legacies: putting the issues of fake news and misinformation in the spotlight. Simon Linacre looks at how that spotlight could at least increase people’s awareness… and asks for readers’ help in doing so.
Cabells completed around a dozen webinars with Indian universities towards the end of 2020 in order to share some of our knowledge of predatory publishing, and also learn from librarians, faculty members and students what their experiences were. Studies have shown that India has both the highest number of predatory journals based there and most authors publishing in them, as well as a government as committed as any to dealing with the problem, so any insight from the region is extremely valuable.
Q&A sessions following the webinars were especially rich, with a huge range of queries and concerns raised. One specific query raised a number of issues: how can researchers know if the index a journal says it is listed in is legitimate or not? As some people will be aware, one of the tricks of the trade for predatory publishers is to promote indices their journals are listed in, which can come in several types:
Pure lies: These journals claim to have an ‘Impact Factor’, but are not listed by Clarivate Analytics in its Master Journal List of titles indexed on Web of Science (and therefore do not have an Impact Factor, unless they have only recently been accepted)
Creative lies: These journals say they are listed by an index, which is true, but the index is little more than a list of journals that say they are listed by it, with the words ‘Impact Factor’ added to make it sound better (e.g. ‘Global Impact Factor’, ‘Scholarly Article Impact Factor’)
Nonsensical lies: These are links (or usually just images) to seemingly random words or universities that try to impart some semblance of recognition, but mean nothing. For example, it may be the name of a list, service or institution, but a quick search elicits nothing connecting those names with the journal
White lies: One of the most common, many predatory journals say they are ‘listed’ or ‘indexed’ by Google Scholar. While it is true to say these journals can be discovered by Google Scholar, they are not listed or indexed for the simple reason that GS is not a list or an index
When Jeffrey Beall was active, he included a list of ‘Misleading Metrics’ on his blog that highlighted some of these issues. A version or versions of this can still be found today, but they are not linked to here because (a) they are out of date by at least four years, and (b) the term ‘misleading’ is, well, misleading, as few of the indexes include metrics in the first place, and the metrics may not be the major problem with the index. However, this information is very valuable, and as such Cabells has begun its own research program to create an objective, independently verifiable and freely available list of fake indexes in 2021. What’s more, we need your help: if you would like to suggest a suspicious-looking journal index for us to investigate, please write to me at email@example.com and we will review the site for inclusion.
As we enter what is an uncertain 2021 for many both personally and professionally, it is worth perhaps taking the opportunity to reset and refocus on what matters most to us. In his latest blog post, Simon Linacre reflects on Cabells’ new video and how it endeavors to show what makes us tick.
It is one of the ironies of modern life that we seem to take comfort in ‘doomscrolling’, that addictive pastime of flicking through Twitter or other social media on the hunt for the next scandal to inflame our ire. Whether it is Brexit, the coronavirus pandemic or alleged election shenanigans, we can’t seem to get enough of the bells of doom tolling out in our collective echo chambers. As the New Year dawns with little good news to cheer us, we may as well go all in as the world goes to hell in a handcart.
Of course, we also like the lighter moments that social media provide, such as cat videos and epic fails. And it is comforting to hear some stories that renew our faith in humanity. One parent on Twitter remarked this week, as the UK’s schools closed and reverted to online learning, that she was so proud of her child who, on hearing the news, immediately started blowing up an exercise ball, resolving not to waste the opportunity lockdown provided to get fit.
Reminding ourselves that the glass can be at least half full even if it looks completely empty is definitely a worthwhile exercise, even if it feels like the effort of constantly refilling it is totally overwhelming. At Cabells, our source of optimism has recently come from the launch of our new video. The aim of the video is to go back to basics and explain what Cabells does, why it does it, and how it does it through its two main products – Journalytics and Predatory Reports.
Making the video was a lot of fun, on what was a beautiful sunny spring day in Edinburgh with one of my US colleagues at an academic conference (remember them?). While nerve-shredding and embarrassing, it was also good to go back to basics and underline why Cabells exists and what we hope to achieve through all the work we do auditing thousands of journals every year.
It also acted as a reminder that there is much to look forward to in 2021 that will keep our glasses at least half full for most of the time. Cabells will launch its new medical journal database early this year, which will see over 5,000 medical journals indexed alongside the 11,000 journals indexed in Journalytics. And we also have major upgrades and enhancements planned for both the Journalytics and Predatory Reports databases that will help researchers, librarians and funders better analyse journal publishing activities. So, let’s raise a (half full) glass to the New Year, and focus on the light at the end of the tunnel and not the darkness that seems to surround us in early January.
In his last blog post in what has been a tumultuous year, Simon Linacre looks forward to a more enlightened 2021 and a new era of open collaboration and information sharing in scholarly communications and higher education.
In a year with so many monumental events, it is perhaps pointless to try and review what has happened. Everyone has lived every moment with such intensity – whether it be through 24-hour news coverage, non-stop social media or simply living life under lockdown – that it seems simply too exhausting to live through it all again. So, let’s fast forward to 2021 instead.
While some of the major concerns from 2020 will no doubt remain well into the New Year, they will also fade away gradually and be replaced by new things that will demand our attention. Difficult as it may seem now, neither Trump, Brexit (for the Brits) nor COVID will have quite the hold on the news agenda as they did, and that means there is an opportunity at least for some more positive news to start to dominate the headlines.
One activity that may succeed in this respect is the open science agenda. With a new budget agreed upon by the European Research Council and a new administration in Washington DC, together with an increasing focus more generally on open science and collaboration, it is to be hoped that there will be enough funding in place to support it. If the recent successes behind the COVID-19 vaccines show anything it is surely that focused, fast, mission-driven research can produce life-changing impacts for a huge number of people. As others have queried, what might happen if the same approach was adopted and supported for tackling climate change?
In the same vein, information sharing and data analysis should also come further to the fore in 2021. While in some quarters, consolidation and strategic partnerships will bring organisations together, in others the importance of data analysis will only become more essential in enabling evidence-based decision-making and creating competitive advantages.
There have also been policy changes in China during 2020 which have meant less reliance on journals with Impact Factors, and more of a push to incentivise publications in high-quality local journals. As such, the ACJR should provide a valuable guide for business school authors in China to some of the top journals available to them. The journals themselves were first identified using a number of established Chinese sources, as well as input from esteemed scholars and deans of top business schools. Recommended journals were then checked using Google Scholar to ensure they had published consistently over the last five years and attracted high levels of citations.
The new list is very much intended to be an introduction to Chinese-language journals in business and management, and we would very much welcome input from people on the list so we can develop it further for a second iteration in 2021.
There are many ways to skin a cat, and many ways to infer a journal could be predatory. In his latest blog post, Simon Linacre looks at the role the International Standard Serial Number, or ISSN, can play in the production of predatory journals.
For many reasons, the year 2020 will be remembered for the sheer volume of numbers that have invaded our consciousness. Some of these are big numbers – 80 million votes for Joe Biden, four million cases of COVID in the US in 2020 – and some of these will be small, such as the number of countries (1) leaving the EU at the end of the year. Wherever we look, we see numbers of varying degrees of import at seemingly every turn.
While numbers were previously regarded as gospel, data has now joined news and UFO sightings (seemingly one of the few phenomena NOT to increase in 2020) as something to be suspicious about or assume has been faked in some way. And one piece of data trusted by many authors in determining the validity or otherwise of a journal is the International Standard Serial Number, or ISSN.
An ISSN can be obtained relatively easily via either a national or international office as long as a journal can be identified as an existing publication. As the ISSN’s own website states, an ISSN is “a digital code without any intrinsic meaning” and does not include any information about the contents of that publication. Perhaps most importantly, an ISSN “does not guarantee the quality or the validity of the contents”. This perhaps goes some way to explain why predatory journals can often include an ISSN on their websites. Indeed, more than 40% of the journals included in Cabells’ Predatory Reports database include an ISSN in their journal information.
But sometimes predatory publishers can’t obtain an ISSN – or at least can’t be bothered to – and will fake the ISSN code. Of the 6,000 or so journals with an ISSN in Predatory Reports, 288, or nearly 5%, have a fake one, and this is included as one of the database’s behavioural indicators to help identify predatory activity. It is instructive to look at these fake ISSNs to see the lengths predatory publishers will go to in order to achieve some semblance of credibility in their site presence.
For some journals, it is obvious that the ISSN is fake because it looks wrong. In the example above for the Journal of Advanced Statistics and Probability, the familiar format of two groups of four digits separated by a hyphen is missing, replaced by nine digits and a forward slash, which is incorrect.
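Beyond the obvious formatting giveaway, a genuine ISSN also carries a verifiable check digit: the first seven digits are weighted 8 down to 2, summed, and the remainder mod 11 determines the final character, with ‘X’ standing for 10. A minimal sketch of this check in Python (the test values here are illustrative, not drawn from Predatory Reports):

```python
import re

def valid_issn(issn: str) -> bool:
    """Check an ISSN's format (NNNN-NNNC) and its mod-11 check digit."""
    if not re.fullmatch(r"\d{4}-\d{3}[\dXx]", issn):
        return False
    digits = issn.replace("-", "")
    # Weight the first seven digits 8, 7, ..., 2 and sum.
    total = sum(int(d) * w for d, w in zip(digits[:7], range(8, 1, -1)))
    remainder = total % 11
    check = 0 if remainder == 0 else 11 - remainder
    expected = "X" if check == 10 else str(check)
    return digits[7].upper() == expected

print(valid_issn("0317-8471"))  # True – a correctly formed ISSN
print(valid_issn("123456789"))  # False – nine digits, wrong format
```

A fake code like the nine-digit example above fails at the first hurdle; a plausible-looking but invented ISSN will usually fail the check digit instead.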
More worrying are the few publications that have hijacked existing, legitimate journals and appropriated their identity, including the ISSN. In the example below, the Wulfenia Journal has had its identity hijacked, with the fake journal website pictured below.
If you compare it to the genuine journal shown below (the German homepage can be found here), you can see they list the same ISSN.
One can only imagine the chaos caused for a legitimate journal when its identity is hijacked, and this is just part of wider concerns about the effects that sharing fake information has on society. As always, arming yourself with the right information – and taking a critical approach to any information directed your way – will help see you through the morass of misinformation we seem to be bombarded with in the online world.
This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.
Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit and receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal to which to submit a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals, together with the increasing “publish or perish” pressure, makes this decision even more difficult.
The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring this market for the last ten years, I have noticed that a new type of journal has tapped into it: the open access (OA) journal.
The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing that I remember most from that visit is that IBIMA’s website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.
Undertaking this research, terms such as “predatory publishers”, “Beall’s List”, and “Think, Check, Submit” were new discoveries for me. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).
Beall’s list was no longer available (it was shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly science, technology, and medicine) to be useful. So, I searched for journals with titles identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and then visited each of their websites to collect information about both the publisher and the journal: whether the journal is OA or not, its article processing charges, whether it has an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, and so on. I even emailed an eminent marketing scholar whose name I was stunned to see on the editorial board of a suspicious journal.
With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).
Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was wrong or right in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These three lists, however, provided no reasons for the inclusion of a particular publisher or journal.
To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.
To be brief, that year of endeavor resulted in a paper I submitted to the prestigious academic journal Scientometrics, published by Springer Nature; it was accepted and published online in late October 2020. In the paper, I reported the findings of a study examining the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”, with no archives). The results indicated that some of these journals received quite a few citations, with a median of 490 citations and one journal receiving 6,296 citations (see also the case study below).
I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory.
A few months earlier, having no access to Cabells’ databases, I read each of the posts on their blog trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (about 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.
The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.
Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-used strategy by predatory publishers to deceive authors who do not make themselves familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of potential journals in this subject discipline.
As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not individually proof of predatory practices, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), false claims of indexation in databases (e.g. DOAJ), no editor contact details, and/or a fake editor identity.
What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below for ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the most citations received by a single article are significant, representing what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proofing, their content is unreliable and skews citation data for reputable research and journals.
Predatory journal: Journal I (BJMS)
Total number of citations received: 1,331
Number of citations received by the most cited article: 99
The most cited article was published in: 2014
Number of citations received from SSCI-indexed journals: 3
Number of citations received from FT50 listed journals: 0
It is a familiar refrain from The Source, but it bears repeating – as an author you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher for publication and not just what you want to publish will save a huge amount of pain in the future, both for avoiding the bad journals and choosing the good ones.
The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.
DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!
Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one tweet a week going through the entire alphabet. In this week’s blog, Simon Linacre republishes the tweets in one place as a primer on how to deal successfully with the phenomenon.
B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals
C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.
D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal
E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature
F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior
H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity
I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)
J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via the @clarivate Master Journal List
K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases
L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database
M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases
P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals
Q = Quick publication. Peer review is a long process typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks
R = Research. Academic #Research should also include research on #Journals to ensure #PublicationEthics and #ResearchIntegrity are followed. Use @CabellsPublish Predatory Reports to support your research
S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database
T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices
U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish
V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (i.e., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish
W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)
X = There are no current ISSNs with more than one ‘X’, which can appear only as the final character, where it serves as a check digit. Be wary of fake ISSNs with multiple X’s
Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article
Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article
This week is Open Access Week, which you will not have missed due to the slew of Twitter activity, press releases and thought pieces being published – unless you are an author, perhaps. In this week’s blog, Simon Linacre focuses on academic researchers who can often be overlooked by the OA conversation, despite the fact they should be the focus of the discussion.
The other day, I was talking to my 16-year-old son about university, as he has started to think about what he might study and where he might like to go (“Dunno” and “dunno” are currently his favourite subjects and destinations). In order to spark some interest in the thought of higher education, I told him about how great the freedom was to be away, the relaxed lifestyle, and the need to be responsible for your own actions, such as handing in your work on time, even if you had to pull an all-nighter.
“What do you mean ‘hand in your work’?”, he said.
“You know, put my essay in my tutor’s pigeon hole”, I said.
“Why didn’t you just email it? And what do pigeons have to do with it?”, he replied.
Yes, university in the early 90s was a very different beast from today, and I decided to leave pigeons out of the ensuing discussion. But it highlighted to me that while a non-digital university experience is now just a crusty anecdote for students in education today, the transition from the 80s and 90s to the present state of affairs is the norm for those teaching in today’s universities. In addition, many of the activities and habits that established themselves 20 to 30 years ago and beyond are still in existence, albeit adapted to new technology.
One of these activities that has changed, yet remained the same, is of course academic publishing. In the eyes of many people, publishing now differs immensely from what it was in the pre-internet 80s – physical vs digital, delayed vs instant, subscription vs open. But while remnants of the older forms of publishing remain in the shape of page numbers or journal issues, there are still shadows from the introduction of open access in the early 2000s. This was brought home to me in some webinars recently in Turkey, Ukraine and India (reported here), where the one common question about predatory journals was: “Are all open access journals predatory?”
To those of us who have worked in publishing, or to Western academics, this may seem a naïve question. But it is not. Open access articles – that is, articles that are both free to read on the internet and free to re-use – are still relatively unknown to many academics around the world. In addition, being asked to pay money to publish is still not the norm – most journals listed by the Directory of Open Access Journals do not charge an article processing charge (APC) – and publisher marketing communications are dominated by spam emails from predatory journals rather than press releases during Open Access Week. As such, while the dial may have moved appreciably in Europe and North America following initiatives such as Plan S and high-profile standoffs such as that between the University of California and Elsevier, discussion about OA may not have been replicated elsewhere.
So, while there will be many interesting conversations about Open Access (and Delta Think has some fascinating data here), it is also important not to forget that many authors may be hearing about it for the first time, or previously may have only heard negative or misleading information. Thankfully, there are plenty of useful resources out there, such as this introduction from Charlesworth Author Services to help authors identify the right OA outlet for their research. And of course, authors should remember that most open access journals are not predatory – but to be on the safe side, they can check our Predatory Reports database or criteria to judge for themselves.
According to some research, India has the unfortunate distinction of having both the highest number of predatory journals based there, and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.
During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies were focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted 2015 article by Shen and Björk, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries in which predatory journals originate.
There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to try to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.
However, many in India are still getting caught out as some journals have been cloned or hijacked, while others use the predatory journal tactic of simply lying about their listing by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authorship. Following one webinar last week to an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than allowed, so I thought it would be worth sharing some of these questions and my answers here so others can hopefully pick up a few tips when they are making that crucial decision to publish their research.
What’s the difference between an Impact Factor and CiteScore? Well, they both look and feel similar, but the Impact Factor (IF) records a journal’s published articles over a two-year period and how they have been cited the following year in other Web of Science-indexed journals, whereas the CiteScore records a journal’s published documents over a three-year period before counting citations the following year.
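The arithmetic behind the IF is simple enough to sketch. The figures below are invented purely for illustration (real Impact Factors use Clarivate’s own counts of citations and citable items):

```python
def two_year_impact_factor(citations_to, citable_items, year):
    """IF(year): citations received in `year` by items published in the
    two prior years, divided by the citable items from those years."""
    cites = citations_to[year - 1] + citations_to[year - 2]
    items = citable_items[year - 1] + citable_items[year - 2]
    return cites / items

# Hypothetical journal: citations received in 2021, keyed by the
# publication year of the cited articles, and article counts per year.
citations_to = {2019: 150, 2020: 210}
citable_items = {2019: 80, 2020: 90}
print(round(two_year_impact_factor(citations_to, citable_items, 2021), 2))  # 2.12
```

The CiteScore follows the same divide-citations-by-documents logic, just over a different window, which is why the two metrics look alike yet rarely agree on a number.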
How do I know if an Impact Factor is fake? Until recently this was tricky, but Clarivate Analytics now publishes the previous year’s IFs for the journals it indexes on its Master Journal List, so a claimed IF can be checked there directly.
If I publish an article in a predatory journal, can the paper be removed? A request can be made to the publisher for the article to be retracted; however, a predatory publisher is very unlikely to accede and will probably just ignore it. Those that do respond have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked against other sources, such as Scimago, which provides country-specific data on journals.
Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.
For many of us, the Summer months are usually filled with holidays, conferences and a less than serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.
It is hard to put a finger on it, but since around 2015 the Summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.
The group in question is the Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, and earlier this year it compiled what it claims is the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020). Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, in collaboration with Anna A. Abalkina, Alexei S. Kassian and Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals containing 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.
In addition, the study found that over 1,100 Russian authors had put their names to translated articles published in predatory journals. These included heads of departments at Russian universities, and three authors had over 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project that is developing a database of mainly Russian journals which publish plagiarised articles or violate other criteria of publication ethics. The authors are concerned that Russian paper mills, which spam researchers and offer easy publication in journals, are driving wide-ranging breaches of publication ethics, with inflated metrics lending those journals an appearance of legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.