Guest Post: A look at citation activity of predatory marketing journals

This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.

Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit/receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal for a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals – and the increasing “publish or perish” pressure – makes this decision even more difficult.

The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring this market for the last ten years, I have noticed that a new type of journal has tapped into it: open access (OA) journals.

The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing I remember most from that visit is that the website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.

While undertaking this research, terms such as “predatory publishers”, “Beall’s List”, and “Think, Check, Submit” were new discoveries for me. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).

Beall’s List was no longer available (it was shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly in science, technology, and medicine) to be useful. So, I searched for journals with titles identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and visited the website of each to collect information about both the publisher and the journal: whether the journal was OA or not, its article processing charges, whether it had an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, and so on. I even emailed an eminent marketing scholar whose name I was stunned to see on the editorial board of a suspicious journal.
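
This kind of first-pass screening can also be scripted. Below is a minimal sketch of the title-similarity idea – not Dr. Moussa’s actual procedure – in which the candidate titles and the 0.7 threshold are purely illustrative assumptions:

```python
# Minimal sketch: flag journal titles suspiciously similar to well-known
# marketing journals. Illustrative only -- not the study's actual method;
# the candidate titles and threshold below are assumptions.
from difflib import SequenceMatcher

REFERENCE_TITLES = [
    "Journal of Marketing",
    "Journal of Marketing Management",
    "Journal of Marketing Research",
]

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two titles (0 to 1)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_suspects(candidates, threshold=0.7):
    """Yield (candidate, reference, score) for near-but-not-exact matches."""
    for cand in candidates:
        for ref in REFERENCE_TITLES:
            score = similarity(cand, ref)
            if score >= threshold and cand.lower() != ref.lower():
                yield cand, ref, round(score, 2)

candidates = [
    "British Journal of Marketing Studies",  # confusingly similar title
    "Journal of Marketing Managment",        # typo-squatted title
]
for cand, ref, score in flag_suspects(candidates):
    print(f"{cand!r} resembles {ref!r} (similarity {score})")
```

In practice one would combine several measures (token overlap, edit distance) and screen against a much larger list of legitimate titles.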

With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).

Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim was to check whether I was right to qualify these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These lists, however, provided no reasons for the inclusion of any particular publisher or journal.

To double-check the list, I used the Directory of Open Access Journals (DOAJ), a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals was indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide of the Chartered Association of Business Schools. None of the 11 journals was ranked in either.
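
Incidentally, the DOAJ cross-check is easy to script. The sketch below assumes that DOAJ’s public search API (documented at https://doaj.org/api/docs) exposes GET /api/search/journals/{query} and returns JSON with a total count; the endpoint and response shape may change, so treat it as illustrative rather than definitive.

```python
# Hedged sketch of automating the DOAJ check described above.
# Assumption: DOAJ's public API exposes GET /api/search/journals/{query}
# and returns JSON with a "total" field (see https://doaj.org/api/docs).
import requests

def in_doaj(issn: str) -> bool:
    """Return True if DOAJ's journal search finds the given ISSN."""
    url = f"https://doaj.org/api/search/journals/issn:{issn}"
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    return resp.json().get("total", 0) > 0

# Usage: a well-known OA journal should be found; none of the 11 suspicious
# journals in the study were indexed in DOAJ.
print(in_doaj("1932-6203"))  # PLOS ONE's eISSN -> expected True
```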

In brief, that year of endeavor resulted in a paper submitted to the prestigious academic journal Scientometrics, published by Springer Nature; the paper was accepted and published online in late October 2020. In it, I reported the findings of a study examining the citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”, with no archives). The results indicated that some of these journals received quite a few citations, with a median of 490 citations per journal and one journal receiving 6,296 citations (see also the Case Study below).

I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some may dislike the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract readership. Those who read the article may see it as a call for marketing researchers not to submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet is non-predatory. Assuming that the number of citations an article receives signals its quality, my findings indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory.

A few months earlier, having no access to Cabells’ databases, I had read each of the posts on their blog, trying to identify marketing journals indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to marketing (about 1% of the total number of journals it lists). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.

Case Study

The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.

Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-worn strategy of predatory publishers, designed to deceive authors who are not familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of established journals in the discipline.

As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not individually proof of predatory behavior, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), false claims of indexation in databases (e.g., DOAJ), no editor contact details, and/or a fake editor identity.

What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below for ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the most citations received by a single article are significant, representing what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proofing, their content is unreliable and skews citation data for reputable research and journals.

  • Predatory journal: Journal I (BJMS)
  • Total number of citations received: 1,331
  • Number of citations received by the most cited article: 99
  • The most cited article was published in: 2014
  • Number of citations received from SSCI-indexed journals: 3
  • Number of citations received from FT50 listed journals: 0
Predatory Reports entry for BJMS

It is a familiar refrain from The Source, but it bears repeating: as an author, you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher to investigate where you publish, not just what you publish, will save a huge amount of pain in the future, both in avoiding the bad journals and in choosing the good ones.

Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination’”, now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

The A-Z’s of predatory publishing

Earlier this year, Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one tweet a week going through the entire alphabet. In this week’s blog, Simon Linacre republishes all 26 tweets in one place as a primer on how to deal successfully with the phenomenon.

A = American. The US is a probable source of #PredatoryJournals activity, as ‘American’ lends credence to the claims of legitimacy a journal may adopt to hoodwink authors into submitting articles #PredatoryAlphabet #PredatoryJournal #PublicationEthics

B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals

C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.

D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal

E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature

F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior

G = Germany, which takes #PredatoryJournals seriously through university-level checks, highlighting the issue and exposing problems in a 2018 investigation

H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity

I = ISSN. Over 4,000 of the 13,000 journals (30%) on the @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)

J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivate Master Journal List

K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be as important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases

L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database

M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases

N = Nature. Using a trusted scholarly brand such as @nature can help identify, understand and define #PredatoryJournals, with dozens of articles on the subject via @NatureNews

O = OMICS. In 2019, the FTC fined publisher OMICS over $50m for deceptive publishing practices

P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals

Q = Quick publication. Peer review is a long process, typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks

R = Research. Academic #Research should also include research on #Journals, to ensure #PublicationEthics and #ResearchIntegrity are upheld. Use @CabellsPublish Predatory Reports to support your research

S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database

T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices

U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (e.g., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)

X = An ISSN’s final character is a check digit, which can be a single ‘X’ (standing for 10); no current ISSN ends in more than one ‘X’. Beware fake ISSNs with multiple X’s
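
For the curious, the check-digit rule is simple: the first seven digits are weighted 8 down to 2 and summed, and the remainder modulo 11 determines the last character, with ‘X’ standing in for 10. That is why a genuine ISSN can never end in more than one ‘X’. A minimal sketch:

```python
# Minimal sketch of the ISSN check-digit rule: the last character is
# derived from the first seven digits, so at most one 'X' (meaning 10)
# can appear, and only in the final position.
def issn_check_digit(first_seven: str) -> str:
    """Compute the ISSN check character for the first seven digits."""
    total = sum(int(d) * w for d, w in zip(first_seven, range(8, 1, -1)))
    remainder = (11 - total % 11) % 11
    return "X" if remainder == 10 else str(remainder)

def is_valid_issn(issn: str) -> bool:
    """Validate an ISSN string such as '0028-0836' (Nature's ISSN)."""
    chars = issn.replace("-", "").upper()
    if len(chars) != 8 or not chars[:7].isdigit():
        return False
    return chars[7] == issn_check_digit(chars[:7])

print(is_valid_issn("0028-0836"))  # True: a real ISSN
print(is_valid_issn("1234-56XX"))  # False: multiple X's cannot be valid
```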

Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article

Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article

Open with purpose

This week is Open Access Week, which you will not have missed due to the slew of Twitter activity, press releases and thought pieces being published – unless you are an author, perhaps. In this week’s blog, Simon Linacre focuses on academic researchers who can often be overlooked by the OA conversation, despite the fact they should be the focus of the discussion.

The other day, I was talking to my 16-year-old son about university, as he has started to think about what he might study and where he might like to go (“Dunno” and “dunno” are currently his favourite subjects and destinations). To spark some interest in the thought of higher education, I told him how great the freedom of being away was, the relaxed lifestyle, and the need to be responsible for your own actions, such as handing in your work on time, even if you had to pull an all-nighter.

“What do you mean ‘hand in your work’?”, he said.

“You know, put my essay in my tutor’s pigeon hole”, I said.

“Why didn’t you just email it? And what do pigeons have to do with it?”, he replied.

Yes, university in the early 90s was a very different beast from today’s, and I decided to leave pigeons out of the ensuing discussion. But it highlighted to me that while a non-digital university experience is now just a crusty anecdote for today’s students, the transition from the 80s and 90s to the present state of affairs is the norm for those teaching in today’s universities. In addition, many of the activities and habits that established themselves 20 to 30 years ago and beyond are still in existence, albeit adapted to new technology.

One of these activities that has changed, but remained the same, is of course academic publishing. In the eyes of many people, publishing now differs enormously from what it was in the pre-internet 80s: physical vs digital, delayed vs instant, subscription vs open. But while the remnants of the older forms of publishing remain in the shape of page numbers or journal issues, there are also still shadows from the introduction of open access in the early 2000s. This was brought home to me in some recent webinars in Turkey, Ukraine and India (reported here), where the one common question about predatory journals was: “Are all open access journals predatory?”

To those of us who have worked in publishing, or to Western academics, this may seem a naïve question. But it is not. Open access articles – articles that are both free to read on the internet and free to re-use – are still relatively unknown to many academics around the world. In addition, being asked to pay money to publish is still not the norm – most journals listed by the Directory of Open Access Journals do not charge an article processing charge (APC) – and publisher marketing communications are dominated by spam emails from predatory journals rather than press releases during Open Access Week. As such, while the dial may have moved appreciably in Europe and North America following initiatives such as Plan S and high-profile standoffs such as that between the University of California and Elsevier, the discussion about OA may not have been replicated elsewhere.

So, while there will be many interesting conversations about open access this week (and Delta Think has some fascinating data here), it is also important not to forget that many authors may be hearing about it for the first time, or may previously have heard only negative or misleading information. Thankfully, there are plenty of useful resources out there, such as this introduction from Charlesworth Author Services to help authors identify the right OA outlet for their research. And of course, authors should remember that most open access journals are not predatory – but to be on the safe side, they can check our Predatory Reports database, or use our criteria to judge for themselves.

Empowering India’s Academia

According to some research, India has the unfortunate distinction of being home to both the highest number of predatory journals and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted 2015 article by Shen and Bjork, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries of origin for predatory journals.

There are probably a number of reasons for this, but rather than speculate, it is perhaps better to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out, as some journals have been cloned or hijacked, while others use the predatory tactic of simply lying about their listing by UGC-CARE, Scopus, Web of Science or Cabells to attract authors. Following one webinar last week at an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than allowed. I thought it would be worth sharing some of those questions and my answers here, so others can pick up a few tips when making that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? They look and feel similar, but the Impact Factor (IF) counts citations received in a given year, in other Web of Science-indexed journals, by a journal’s articles published over the previous two years, whereas CiteScore counts citations received in a given year by a journal’s documents published over the previous three years (see the formulas after this list).
  2. How do I know if an Impact Factor is fake? Until recently, this was tricky, but Clarivate Analytics has now made the previous year’s IFs for the journals it indexes available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A request can be made to the publisher for the article to be retracted; however, a predatory publisher is very unlikely to accede and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago which provides country-specific data on journals.
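
For reference, the two windows described in answer 1 can be written as simple ratios. This is a sketch based on the descriptions above; the precise rules for which items count as citable are set by Clarivate and Scopus respectively (and note that Elsevier revised the CiteScore methodology in 2020, so current values use a different window).

```latex
\[
\mathrm{JIF}_{Y} \;=\;
\frac{\text{citations in year } Y \text{ to articles published in years } Y-1 \text{ and } Y-2}
     {\text{citable items published in years } Y-1 \text{ and } Y-2}
\]
\[
\mathrm{CiteScore}_{Y} \;=\;
\frac{\text{citations in year } Y \text{ to documents published in years } Y-1,\, Y-2,\, Y-3}
     {\text{documents published in years } Y-1,\, Y-2,\, Y-3}
\]
```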

Simon Linacre, Cabells

The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the summer months are usually filled with holidays, conferences and a less-than-serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as which cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and articles published in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research at the Russian Academy of Sciences, and earlier this year it compiled what it claims is the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020), Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, in collaboration with Anna A. Abalkina, Alexei S. Kassian, and Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals containing 259 articles from Russian authors, many of them plagiarised after being translated from Russian into English.

In addition, the study found that over 1,100 Russian authors had put their names to translated articles published in predatory journals. These included heads of departments at Russian universities and, in the case of three authors, over 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project that is developing a database of mainly Russian journals which publish plagiarised articles or otherwise violate publication ethics. They are concerned that the existence of paper mills in Russia that spam authors and offer easy publication is leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend such journals legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

How do you know you can trust a journal?

As many readers know, this week is Peer Review Week, the annual opportunity for those involved in scholarly communication and research to celebrate and learn about all aspects of peer review. As part of this conversation, Simon Linacre reflects on this year’s theme of ‘Trust in Peer Review’ in terms of the important role of peer review in the validation of scholarship, and dangers of predatory behaviour in its absence.


I was asked to deliver a webinar recently to a community of scholars in Eastern Europe and, as always with webinars, I was very worried about the Q&A section at the end. When you deliver a talk in person, you can tell by looking at the crowd what is likely to happen at the end of the presentation and can prepare yourself. A quiet group of people means you may have to ask yourself some pretty tough questions, as no one will put their hand up at the end to ask you anything; a rowdy crowd is likely to throw anything and everything at you. With a webinar, there are no cues, and as such, it can be particularly nerve-shredding.

With the webinar in question, I waited a while for a question and was starting to prepare my quiet crowd response, when a single question popped up in the chat box:

How do you know you can trust a journal?

As with all the best questions, this floored me for a while. How do you know? The usual things flashed across my mind: reputation, whether it’s published known scholars in its field, whether it is indexed by Cabells or other databases, etc. But suddenly the word trust felt a lot more personal than simply a tick box exercise to confirm a journal’s standing. That may confirm it is trustworthy but is that the same as the feeling an individual has when they really trust something or someone?

The issue of trust is often the unsaid part of the global debates raging currently, whether over responses to the coronavirus epidemic, climate change or democracy. Politicians, as always, want the people to trust them; but increasingly, their actions seem to make that trust harder and harder to give. As I write, the UK has just put its two top scientists in front of the cameras to deliver a grave warning about COVID-19 and a second wave of cases. The fact that no senior politician joined them was highly symbolic.

It is against this background that the theme Trust in Peer Review is an appropriate one for Peer Review Week (full disclosure: I have recently joined one of the PRW committees to support the initiative). There is a huge groundswell of support from publishers, editors and academics for both the effectiveness of peer review and the unsung heroes who do the job for little recognition or reward; the absence of peer review would have profound implications for research and society as a whole.

Which brings me to the answer to the question posed above, which is to ask the opposite: how do you know when you cannot trust a journal? This is easier to answer, as you can point to all those characteristics and behaviours that you would want in a journal and check for their absence. We see on a daily basis, through our work on Predatory Reports, how the absence of crucial aspects of a journal’s workings can cause huge problems for authors: no listed editor, a fake editorial board, a borrowed ISSN, a hijacked journal identity, a made-up impact factor, and – above all – false promises of a robust peer review process. Trust in peer review may require some research on the part of the author: checking the background of the journal, its publisher and its editors, and perhaps contacting the editor, editorial board members or published authors for personal advice on publishing in that journal. But doing that work in the first place, and receiving personal recommendations, will build trust in peer review for any authors who have doubts – and collectively for all members of the academic community.

Know before you go

Earlier this week, the Guardian in the UK released its updated university rankings, just the latest of myriad national and international exercises in defining the “best” university. At a time when deciding to go to university is fraught with unknowns, Simon Linacre argues that critical thinking skills are more important than ever.


I’ll admit it, I love it when my own university tops any kind of ranking. The fact that I was there 25 years ago, and that the subjects and academics taught there today would be unrecognisable to me, is of no consequence. That the department I graduated from is the BEST in the country is something I have to tell my wife and colleagues about, despite the arched eyebrows and disdain I can feel from several thousand miles away.

What does this mean? Well, aside from the fact that I must be a walking target for alumni fundraisers, it shows that my critical faculties, if not entirely absent, are certainly being overridden by shameless pride in something I have no right to be proud about. But like your favourite football team, you can’t help pulling for them through thick and thin – when they suck you say “we’re awful!”, and when they do well you say “we’re top!”.

Who’s “we”?

However, deciding which university to go to in a year or two is not the same as choosing a football team to support. You should use every critical faculty you have and hone it until it is razor-sharp before you even think of filling in a form or visiting a campus. That means you should learn to read a university ranking the way you would read a balance sheet before investing in a company, or review a journal before submitting an article. I do not believe there is anything inherently wrong in any ranking, as each can provide extremely useful data on which to base a decision. But you need to know where the data came from and how it relates to the investment you are making in your future life and career. This is why we always recommend that users of Cabells’ Journalytics database draw on other relevant data points for their individual circumstances.

This week, the Guardian published its UK university rankings for 2021, with Oxford, St Andrews and Cambridge leading the way overall (full disclosure: I attended St Andrews). Each broad subject is then broken down into separate rankings, with the promise that, “unlike other league tables, the Guardian rankings are designed with students in mind.” What, other university rankings do NOT have students in mind? Straight away, the amount of spin adopted here should tell you that (a) you should be wary of other hyperbole, and (b) you should look at other league tables to see why the Guardian would say this.

And there are plenty of tables to choose from – both nationally and internationally there are dozens of such rankings, all seeking to advise students on which university to choose. Why? Because six of the university guide’s 48 pages are adverts. Organisations publish rankings guides to sell advertising and to cement their reputation as education experts, further enhancing their opportunities to sell education-related advertising in the future. Knowing why there is spin, and why these guides exist in the first place, should help students understand the information in front of them and make a better decision.

But students should also dig deep into the data. In the “Business, management and marketing” subject ranking, readers are told that “most universities will boast of having good links with business,” that “group work is a key part of many courses” and that “there will also be a practical element to assessment.” But none of these points is addressed in the rankings, which include data on criteria such as course and teaching satisfaction, spend per student and “career after 15 months.” All of this information is relevant, but only some of it has data to back it up.

Sitting with my 12-year-old at breakfast, he looked at the page on architecture (which he has wanted to study since the age of about seven) and decided he should go to Cambridge, UCL or Bath as the top three for that subject. None of those would be a bad choice, but nor would any of them be an informed one.

Special report: Assessing journal quality and legitimacy

Earlier this year Cabells engaged CIBER Research (http://ciber-research.eu/) to support its product and marketing development work. Today, in collaboration with CIBER, Simon Linacre looks at the findings and implications for scholarly communications globally.


In recent months the UK-based publishing research body CIBER has been working with Cabells to better understand the academic publishing environment both specifically in terms of Medical research publications, and more broadly with regard to the continuing problems posed by predatory journals. While the research was commissioned privately by Cabells, it was always with the understanding that much of the findings could be shared openly to enable a better understanding of these two key areas.

The report – Assessing Journal Quality and Legitimacy: An Investigation into the Experience and Views of Researchers and Intermediaries, with special reference to the Health Sector and Predatory Publishing – has been shared today on CIBER’s website; the following briefly summarizes the key findings from six months of research:

  • The team at CIBER Research was asked to investigate how researchers in the health domain go about selecting journals to publish their papers, what tools they use to help them, and what their perceptions of new scholarly communications trends are, especially with regard to predatory journals. Through a mixture of questionnaire surveys and qualitative interviews with over 500 researchers and ‘intermediaries’ (i.e., librarians and research managers), the research pointed to a high degree of self-sufficiency among researchers regarding journal selection
  • While researchers tended to use tools such as information databases to aid their decision-making, intermediaries focused on sharing their own experiences and providing education and training solutions to researchers. Overall, it was notable how much of a mismatch there was between what researchers said and what intermediaries did or believed
  • So-called ‘whitelists’ were common at national and institutional levels, as was the emergence of ‘greylists’ of journals to be wary of; however, there seemed to be no list of recommended journals in Medical research areas
  • In China, alongside its huge growth in research and publication output are concerns that predatory publishing could have an impact, with one participant stating that, “More attention is being paid to the potential for predatory publishing and this includes the emergence of Blacklists and Whitelists, which are government-sponsored. However, there is not just one there are many 10 or 20 or 50 different (white)lists in place”
  • In India, the explosion of predatory publishing is perhaps the consequence of educational and research expansion without the infrastructure capacity to deal with it. An additional factor could be the lack of impetus at a local level to establish new journals: unlike in countries such as Brazil, universities in India are not legally able to establish new titles themselves. As a result, an immature market has attempted to develop new journals to satisfy scholars’ needs, which in turn has led to the rise of predatory publishing in the country
  • Predatory publishing practices seemed to be having an increased impact on mainstream publishing activities globally, with a grave risk of “potentially polluting repositories and citation indexes but there seems to have been little follow through by anyone.” National bodies, publishers and funders have failed to act on the threat, or to examine how it may have diverted funds away from legitimate publications to those engaged in illicit activities
  • Overall, predatory publishing is being driven by publish-or-perish scenarios, particularly with early career researchers (ECRs) where authors are unaware of predatory publishers in general, or of the identity of a specific journal. However, a cynical manipulation of such journals as outlets for publications is also suspected.


[Image: ‘Why do you think researchers publish in predatory journals’ – chart from the CIBER report]

CIBER Research is an independent group of senior academic researchers from around the world, who specialize in scholarly communications and publish widely on the topic. Their most recent projects have included studies of early career researchers, digital libraries, academic reputation and trustworthiness.

A case study of how bad science spreads

Fake news has been the go-to criticism of the media for some politicians, which in turn has been rejected as propaganda and fear-mongering by journalists. However, as former journalist Simon Linacre argues, the fourth estate needs to get its own house in order first, and ensure it is not tripped up by predatory journals.


I class myself as a ‘runner’, but only in the very loosest sense. Inspired by my wife taking up running a few years ago, I decided I should exercise consistently instead of the numerous half-hearted, unsuccessful attempts I had made over the years. Three years later I have done a couple of half-marathons, run every other day, and track my performance obsessively on Strava. I have also recently started to read articles on running online, and have subscribed to the magazine Runners World. So yes, I think I may actually be a runner now.

But I’m also an experienced journalist, a huge cynic, and have a bulls*** radar the size of the Grand Canyon, so even while relaxing with my magazine I like to think I can spot fakery a mile off. And so it proved earlier this summer while reading a piece on how hill running can improve your fitness. This was music to my ears as someone who lives halfway up a valley side, but my interest was piqued when I saw a reference to the study that formed the basis of the piece: an article in the International Journal of Scientific Research. Immediately, I smelt a rat. “There is no way that is the name of a reputable, peer-reviewed journal,” I thought. And I was right.

But that wasn’t even half of the problem.

After checking Cabells’ Predatory Reports database, I found not one but TWO journals with that name listed, both with long lists of breaches of Cabells’ criteria for identifying predatory journals. I was still curious as to the nature of the research, as it could have been legitimate research in an illegitimate journal, or just bad research, full stop. As it turned out, neither journal had ever published any research on hill running and the benefits of increasing VO2 max. So where was the story from?

After some more digging, an article matching the details in the Runners World piece could be found in a third, similarly named journal, the International Journal of Scientific and Research Publications. The article, far from being the recent breakthrough suggested in the August 2020 issue of Runners World, was actually published in August 2017 by two authors from Addis Ababa University in Ethiopia. While the science of the article seems OK, the experiment that produced the results involved just 32 people over 12 weeks, which means it needs further validation with larger samples to confirm its findings. Furthermore, while the journal itself was not included in Cabells’ Predatory Reports database at the time, a review found significant failings, including unusually quick peer review processes and, more seriously, that the “owner/Editor of the journal or publisher falsely claims academic positions or qualifications”. The journal has subsequently been added to Predatory Reports, and the article itself has never been cited in the three years since publication.

Yet one question remains: how did a relatively obscure article, published in a predatory journal and never cited, find its way into a news story in a leading consumer magazine? Interestingly, similar research was quoted on MSN.com in May 2020, also citing the International Journal of Scientific Research, while other sites have quoted the same research but attributed it to the International Journal of Scientific and Research Publications. It appears likely that, having been quoted online once, the same story has been doing the rounds for three years like a game of ‘Telephone’, all based on uncited research that may not have been peer reviewed in the first place, that used a small sample size, and that was published in a predatory journal.

While no damage has been done here – underlying all this, it does make sense that hill running can aid overall performance – one need only think about the string of recent health news stories around the coronavirus to see how one unverified article could sweep like wildfire through news outlets and online. This is the danger that predatory journals pose.