Posts

Open with purpose

This week is Open Access Week, which you will not have missed thanks to the slew of Twitter activity, press releases and thought pieces being published – unless you are an author, perhaps. In this week’s blog, Simon Linacre focuses on academic researchers, who are often overlooked in the OA conversation despite the fact that they should be the focus of the discussion.

The other day, I was talking to my 16 year-old son about university, as he has started to think about what he might study and where he might like to go (“Dunno” and “dunno” are currently his favourite subjects and destinations). In order to spark some interest in the thought of higher education, I told him about how great the freedom was to be away, the relaxed lifestyle, and the need to be responsible for your own actions, such as handing in your work on time, even if you had to pull an all-nighter.

“What do you mean ‘hand in your work’?”, he said.

“You know, put my essay in my tutor’s pigeon hole”, I said.

“Why didn’t you just email it? And what do pigeons have to do with it?”, he replied.

Yes, university in the early 90s was a very different beast from today’s, and I decided to leave pigeons out of the ensuing discussion. But it highlighted to me that while a non-digital university experience is now just a crusty anecdote for students in education today, the transition from the 80s and 90s to the present state of affairs is the norm for those teaching in today’s universities. In addition, many of the activities and habits that established themselves 20 to 30 years ago and beyond are still in existence, albeit adapted to new technology.

One of these activities that has changed, but remained the same, is of course academic publishing. In the eyes of many people, publishing now differs incredibly from what it was in the pre-internet 80s – physical vs digital, delayed vs instant, subscription vs open. But while remnants of the older forms of publishing survive in the shape of page numbers or journal issues, there are still shadows from the introduction of open access in the early 2000s. This was brought home to me in some recent webinars in Turkey, Ukraine and India (reported here), where the one common question about predatory journals was: “Are all open access journals predatory?”

To those of us who have worked in publishing, or to Western academics, this may seem a naïve question. But it is not. Open Access articles – that is, articles that are both free to read on the internet and free to re-use – are still relatively unknown to many academics around the world. In addition, being asked to pay to publish is still not the norm – most journals listed by the Directory of Open Access Journals do not charge an Article Processing Charge (APC) – and publisher marketing communications are dominated by spam emails from predatory journals rather than press releases during Open Access Week. As such, while the dial may have moved appreciably in Europe and North America following initiatives such as Plan S and high-profile standoffs such as that between the University of California and Elsevier, the discussion about OA may not have been replicated elsewhere.

So, while there will be many interesting conversations about Open Access this week (and Delta Think has some fascinating data here), it is also important not to forget that many authors may be hearing about it for the first time, or may previously have heard only negative or misleading information. Thankfully, there are plenty of useful resources out there, such as this introduction from Charlesworth Author Services to help authors identify the right OA outlet for their research. And of course, authors should remember that most Open Access journals are not predatory – but to be on the safe side, they can check our Predatory Reports database, or use our criteria to judge for themselves.

Empowering India’s Academia

According to some research, India has the unfortunate distinction of having both the highest number of predatory journals based there, and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies have been focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted 2015 article by Shen and Bjork, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries in which predatory journals originate.

There are probably a number of reasons that account for this, but rather than speculate it is perhaps better to try to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out as some journals have been cloned or hijacked, while others use the predatory journal tactic of simply lying about their listing by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authorship. Following one webinar last week to an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than allowed, so I thought it would be worth sharing some of these questions and my answers here so others can hopefully pick up a few tips when they are making that crucial decision to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? Well, they both look and feel the same, but the Impact Factor (IF) takes a journal’s published articles over a two-year period and counts how they have been cited the following year in other Web of Science-indexed journals, whereas the CiteScore takes a journal’s published documents over a three-year period before counting citations the following year.
  2. How do I know if an Impact Factor is fake? Until recently, this was tricky, but Clarivate Analytics has now made the previous year’s IFs for the journals it indexes available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A submission can be made to the publisher for an article to be retracted, however a predatory publisher is very unlikely to accede to the request and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago which provides country-specific data on journals.
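Both metrics described in point 1 boil down to a citations-per-document average; they differ mainly in the publication window and the citation index used. A minimal sketch makes the arithmetic concrete (the function names and all figures below are invented for illustration, not real journal data):

```python
# Both the Impact Factor and CiteScore are ratios of citations received
# in a year to documents published in a preceding window; only the
# window length and the citation source (Web of Science vs Scopus) differ.

def impact_factor(citations: int, articles: int) -> float:
    """IF-style ratio: citations received in a given year to articles
    published in the previous two years, divided by the number of those
    articles (citations counted in Web of Science-indexed journals)."""
    return citations / articles

def citescore(citations: int, documents: int) -> float:
    """CiteScore-style ratio, per the methodology described above:
    citations received in a year to documents published in the previous
    three years, divided by the number of those documents."""
    return citations / documents

# A hypothetical journal: 120 citations to 80 articles from the last
# two years, and 150 citations to 130 documents over three years.
print(round(impact_factor(120, 80), 2))   # 1.5
print(round(citescore(150, 130), 2))      # 1.15
```

The same journal can therefore carry quite different-looking numbers under the two schemes, which is one reason a plausible-sounding but fabricated metric is hard for authors to spot at a glance.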

Simon Linacre, Cabells

The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the Summer months are usually filled with holidays, conferences and a less than serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the Summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, and earlier this year they compiled what they claimed to be the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020). Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences in collaboration with Anna A. Abalkina, Alexei S. Kassian, Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals with 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.

In addition, the study found that over 1,100 Russian authors had put their names to translated articles which were published in predatory journals. These included heads of departments at Russian universities and, in the case of three authors, over 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project which is developing a database of mainly Russian journals which publish plagiarised articles or violate other criteria of publication ethics. They are concerned that the existence of paper mills in Russia that spam authors and offer easy publication in journals is leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend them some legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

How do you know you can trust a journal?

As many readers know, this week is Peer Review Week, the annual opportunity for those involved in scholarly communication and research to celebrate and learn about all aspects of peer review. As part of this conversation, Simon Linacre reflects on this year’s theme of ‘Trust in Peer Review’ in terms of the important role of peer review in the validation of scholarship, and dangers of predatory behaviour in its absence.


I was asked to deliver a webinar recently to a community of scholars in Eastern Europe and, as always with webinars, I was very worried about the Q&A section at the end. When you deliver a talk in person, you can tell by looking at the crowd what is likely to happen at the end of the presentation and can prepare yourself. A quiet group of people means you may have to ask yourself some pretty tough questions, as no one will put their hand up at the end to ask you anything; a rowdy crowd is likely to throw anything and everything at you. With a webinar, there are no cues, and as such, it can be particularly nerve-shredding.

With the webinar in question, I waited a while for a question and was starting to prepare my quiet crowd response, when a single question popped up in the chat box:

How do you know you can trust a journal?

As with all the best questions, this floored me for a while. How do you know? The usual things flashed across my mind: reputation, whether it’s published known scholars in its field, whether it is indexed by Cabells or other databases, etc. But suddenly the word trust felt a lot more personal than simply a tick box exercise to confirm a journal’s standing. That may confirm it is trustworthy but is that the same as the feeling an individual has when they really trust something or someone?

The issue of trust is often the unsaid part of the global debates that are raging currently, whether it is responses to the coronavirus epidemic, climate change or democracy. Politicians, as always, want the people to trust them; but increasingly their actions seem to be making that trust harder and harder. As I write, the UK put its two top scientists in front of the cameras to give a grave warning about COVID-19 and a second wave of cases. The fact there was no senior politician to join them was highly symbolic.

It is with this background that the choice of the theme Trust in Peer Review is an appropriate one for Peer Review Week (full disclosure: I have recently joined one of the PRW committees to support the initiative). There is a huge groundswell of support from publishers, editors and academics for both the effectiveness of peer review and the unsung heroes who do the job for little recognition or reward, the absence of which would have profound implications for research and society as a whole.

Which brings me to the answer to the question posed above, which is to ask the opposite: how do you know when you cannot trust a journal? This is easier to answer as you can point to all those characteristics and behaviours that you would want in a journal. We see on a daily basis with our work on Predatory Reports how the absence of crucial aspects of a journal’s workings can cause huge problems for authors. No listed editor, a fake editorial board, a borrowed ISSN, a hijacked journal identity, a made-up impact factor, and – above all – false promises of a robust peer review process. Trust in peer review may require some research on the part of the author in terms of checking the background of the journal, its publisher and its editors, and it may require you to contact the editor, editorial board members or published authors to get personal advice on publishing in that journal. But doing that work in the first place and receiving personal recommendations will build trust in peer review for any authors who have doubts – and collectively for all members of the academic community.

Know before you go

Earlier this week, the Guardian in the UK released its updated university rankings, just the latest of myriad national and international exercises in defining the “best” university. At a time when deciding to go to university is fraught with unknowns, Simon Linacre argues that critical thinking skills are more important than ever.


I’ll admit it, I love it when my own university tops any kind of ranking. The fact that I was there 25 years ago, and that the subjects and academics taught there are now unrecognisable, is of no consequence. That the department I graduated from is the BEST in the country is something I have to tell my wife and colleagues about, despite the arched eyebrows and disdain I can feel from several thousand miles away.

What does this mean? Well, aside from the fact that I must be a walking target for alumni fundraisers, it shows that my critical faculties, if not entirely absent, are certainly being overridden by shameless pride in something I have no right to be proud about. But like your favourite football team, you can’t help pulling for them through thick and thin – when they suck you say “we’re awful!”, and when they do well you say “we’re top!”.

Who’s “we”?

However, deciding which university to go to in a year or two is not the same as choosing a football team to support. You should use every critical faculty you have and hone it until it is razor-sharp before you even think of filling in a form or visiting a campus. And that means you should learn to read a university ranking like you would read a balance sheet before investing in a company, or reviewing a journal before submitting an article. I do not believe there is anything inherently wrong in any ranking as it can provide extremely useful data on which to base a decision. But you need to know where the data came from and how it relates to the investment you are making in your future life and career. This is why we always recommend users of Cabells’ Journalytics database use other relevant data points for their individual circumstances.

This week, the Guardian published its UK university rankings for 2021, with Oxford, St Andrews and Cambridge leading the way overall (full disclosure: I attended St Andrews). Each broad subject is then broken down into separate rankings with the promise that, “unlike other league tables, the Guardian rankings are designed with students in mind.” What, other university rankings do NOT have students in mind? Straight away, the amount of spin adopted here should tell you that (a) you should be careful of other hyperbole, and (b) you should look at other league tables to see why the Guardian would say this.

And there are plenty of tables to choose from – both nationally and internationally there are dozens of such rankings, all seeking to advise students on which university to choose. Why? Because six of the university guide’s 48 pages are adverts. Organisations publish rankings guides to sell advertising and cement their reputation as education experts, further enhancing their opportunities to sell education-related advertising in the future. Knowing why there is spin, and why these guides exist in the first place, should help students understand the information in front of them and ensure a better decision-making process.

But students should also dig deep into the data. In the “Business, management and marketing” subject ranking, readers are told that, “most universities will boast of having good links with business,” that “group work is a key part of many courses” and “there will also be a practical element to assessment.” But none of these points are addressed in the rankings, which include data on criteria such as course and teaching satisfaction, spend per student and “career after 15 months.” All of this information is relevant but only some has data to back it up.

Sitting with my 12-year-old at breakfast, he looked at the page on architecture (which he has wanted to do since the age of about seven), and decided he should go to Cambridge, UCL or Bath as the top three for that subject. None of those would be a bad choice, but neither would they be an informed one.

Special report: Assessing journal quality and legitimacy

Earlier this year Cabells engaged CIBER Research (http://ciber-research.eu/) to support its product and marketing development work. Today, in collaboration with CIBER, Simon Linacre looks at the findings and implications for scholarly communications globally.


In recent months the UK-based publishing research body CIBER has been working with Cabells to better understand the academic publishing environment both specifically in terms of Medical research publications, and more broadly with regard to the continuing problems posed by predatory journals. While the research was commissioned privately by Cabells, it was always with the understanding that much of the findings could be shared openly to enable a better understanding of these two key areas.

The report – Assessing Journal Quality and Legitimacy: An Investigation into the Experience and Views of Researchers and Intermediaries, with special reference to the Health Sector and Predatory Publishing – has been shared today on CIBER’s website, and the following briefly summarizes the key findings from six months’ worth of research:

  • The team at CIBER Research was asked to investigate how researchers in the health domain went about selecting journals to publish their papers in, what tools they used to help them, and what their perceptions of new scholarly communications trends were, especially with regard to predatory journals. Through a mixture of questionnaire surveys and qualitative interviews with over 500 researchers and ‘intermediaries’ (i.e. librarians and research managers), the research pointed to a high degree of self-sufficiency among researchers regarding journal selection.
  • While researchers tended to use tools such as information databases to aid their decision-making, intermediaries focused on sharing their own experiences and providing education and training solutions to researchers. Overall, it was notable how much of a mismatch there was between what researchers said and what intermediaries did or believed.
  • So-called ‘whitelists’ were common on a national and institutional level, as was the emergence of ‘greylists’ of journals to be wary of; however, there seemed to be no list of recommended journals in Medical research areas.
  • In China, alongside its huge growth in research and publication output are concerns that predatory publishing could have an impact, with one participant stating that, “More attention is being paid to the potential for predatory publishing and this includes the emergence of Blacklists and Whitelists, which are government-sponsored. However, there is not just one there are many 10 or 20 or 50 different (white)lists in place”
  • In India, the explosion of predatory publishing is perhaps the consequence of educational and research expansion and the absence of infrastructure capacity to deal with it. An additional factor could be a lack of significant impetus at a local level to establish new journals, unlike in countries such as Brazil, however, universities are not legally able to establish new titles themselves. As a result, an immature market has attempted to develop new journals to satisfy scholars’ needs which in turn has led to the rise of predatory publishing in the country
  • Predatory publishing practices seemed to be having an increased impact on mainstream publishing activities globally, with a grave risk of “potentially polluting repositories and citation indexes but there seems to have been little follow through by anyone.” National bodies, publishers and funders have failed to follow through on the threat, which may have diverted funds away from legitimate publications to those engaged in illicit activities.
  • Overall, predatory publishing is being driven by publish-or-perish scenarios, particularly with early career researchers (ECRs) where authors are unaware of predatory publishers in general, or of the identity of a specific journal. However, a cynical manipulation of such journals as outlets for publications is also suspected.

 

‘Why do you think researchers publish in predatory journals?’

 


CIBER Research is an independent group of senior academic researchers from around the world, who specialize in scholarly communications and publish widely on the topic. Their most recent projects have included studies of early career researchers, digital libraries, academic reputation and trustworthiness.

 

A case study of how bad science spreads

Fake news has been the go-to criticism of the media for some politicians, which in turn has been rejected as propaganda and fear-mongering by journalists. However, as former journalist Simon Linacre argues, the fourth estate needs to have its own house in order first, and ensure it is not tripped up by predatory journals.


I class myself as a ‘runner’, but only in the very loosest sense. Inspired by my wife taking up running a few years ago, I decided I should exercise consistently instead of the numerous half-hearted, unsuccessful attempts I had made over the years. Three years later I have done a couple of half-marathons, run every other day, and track my performance obsessively on Strava. I have also recently started to read articles on running online, and have subscribed to the magazine Runners World. So yes, I think I may actually be a runner now.

But I’m also an experienced journalist, a huge cynic, and have a bulls*** radar the size of the Grand Canyon, so even while relaxing with my magazine I like to think I can spot fakery a mile off. And so it proved earlier this summer while reading a piece on how hill running can improve your fitness. This was music to my ears as someone who lives half-way up a valley side, but my interest was then piqued when I saw a reference to the study that formed the basis for the piece, which was to an article in the International Journal of Scientific Research. Immediately, I smelt a rat. “There is no way that is the name of a reputable, peer-reviewed journal,” I thought. And I was right.

But that wasn’t even half of the problem.

After checking Cabells’ Predatory Reports database, I found not one but TWO journals are listed on the database with that name, both with long lists of breaches of the Cabells’ criteria that facilitate the identification of predatory journals. I was still curious as to the nature of the research, as it could have been legitimate research in an illegitimate journal, or just bad research, full stop. As it turned out, neither journal had ever published any research on hill running and the benefits of increasing VO2 max. So where was the story from?

After some more digging, an article matching the details in the Runners World piece could be found in a third similarly named journal, the International Journal of Scientific and Research Publications. The article, far from being the recent breakthrough suggested in the August 2020 issue of Runners World, was actually published in August 2017 by two authors from Addis Ababa University in Ethiopia. While the science of the article seems OK, the experiment that produced the results covered just 32 people over 12 weeks, which means it really needs further validation with a larger sample of subjects to confirm its findings. Furthermore, while the journal itself was not included in Cabells’ Predatory Reports database, a review found significant failings, including unusually quick peer review processes and, more seriously, that the “owner/Editor of the journal or publisher falsely claims academic positions or qualifications”. The journal has subsequently been added to Predatory Reports, and the article itself has never been cited in the three years since publication.

Yet one question remains: how did a relatively obscure article, published in a predatory journal and never cited, find its way into a news story in a leading consumer magazine? Interestingly, similar research was quoted on MSN.com in May 2020, also citing the International Journal of Scientific Research, while other sites have attributed the same research to the International Journal of Scientific and Research Publications. It appears likely that, having been quoted online once, the same story has been doing the rounds for three years like a game of ‘Telephone,’ all based on uncited research that may never have been peer reviewed, used a small sample size and was published in a predatory journal.

While no damage has been done here – underlying all this, it does make sense that hill running can aid overall performance – one need only think about the string of recent health news stories around the coronavirus to see how one unverified article could sweep like wildfire through news outlets and online. This is the danger that predatory journals pose.

They’re not doctors, but they play them on TV

Recently, while conducting investigations of suspected predatory journals, our team came across a lively candidate. At first, as is often the case, the journal in question seemed to look the part of a legitimate publication. However, after taking a closer look and reading through one of the journal’s articles (“Structural and functional brain differences in key opinion journal leaders“) it became clear that all was not as it seemed.

Neurology and Neurological Sciences: Open Access, from MedDocs Publishers, avoids a few of the more obvious red flags that indicate deceitful practices, even to neophyte researchers, but lurking just below the surface are several clear behavioral indicators common to predatory publications.


With a submission date of August 22, 2018, and a publication date of November 13, 2018, the timeline suggests that some sort of peer review of this article may have been carried out. A closer examination of the content makes it evident that little to no peer review actually took place. The first tip-off was the double-take-inducing line in the “Material and methods” section: “To avoid gender bias, we recruited only males.” Wait, what? That’s not how that works.

It soon became clear to our team that even a rudimentary peer review process (or perhaps two minutes on Google) would have led to this article’s immediate rejection. While predatory journals are no laughing matter, especially when it comes to medical research in the time of a worldwide pandemic, it is hard not to get a chuckle from some of the “easter eggs” found within articles intended to expose predatory journals. Some of our favorites from this article:

  • Frasier Crane, a listed author, is the name of the psychiatrist from the popular sitcoms Cheers and Frasier
  • Another author, Alfred Bellow, is the name of the NASA psychiatrist from the TV show I Dream of Jeannie
  • Marvin Monroe is the counselor from The Simpsons
  • Katrina Cornwell is a therapist turned Starfleet officer on Star Trek: Discovery
  • Faber University is the name of the school in Animal House (Faber College in the film)
  • Orbison University, which also doesn’t exist, is likely a tribute to the late, great musician Roy Orbison

And, perhaps our favorite find and one we almost missed:

  • In the “Acknowledgments” section the authors thank “Prof Joseph Davola for his advice and assistance.” This is quite likely an homage to the Seinfeld character “Crazy Joe Davola.”

Though our team had a few laughs with this investigation, they were not long-lived, as this is yet another illustration of the thousands of journals such as this one in operation (Predatory Reports currently lists well over 13,000 titles) – outlets that publish almost (or literally) anything, usually for a fee, with no peer review or other oversight in place and with no consideration of the detrimental effect this may have on science and research.

Predatory Reports listing for Neurology and Neurological Sciences: Open Access

A more nuanced issue that deceptive publications create involves citations. If this was legitimate research, the included citations would not ‘count’ or be picked up anywhere since this journal is not indexed in any citation databases. Furthermore, any citation in a predatory journal that cites a legitimate journal is ‘wasted’ as the legitimate journal cannot count or use that citation appropriately as a foundation for its legitimacy. However, these citations could be counted via Google Scholar, although (thankfully) this journal has zero. Citation ‘leakage’ can also occur, where a legitimate journal’s articles cite predatory journals, effectively ‘leaking’ those citations out of the illegitimate scholarly publishing sphere into legitimate areas. These practices can have the effect of skewing citation metrics which are measures often relied upon (sometimes exclusively, often too heavily) to gauge the legitimacy and impact of academic journals.

When all is said and done, as this “study” concludes, “the importance of carefully selecting journals when considering the submission of manuscripts,” cannot be overstated. While there is some debate around the use of “sting” articles such as this one to expose predatory publications, not having them exposed at all is far more dangerous.

Right path, wrong journey

In his latest post, Simon Linacre reviews the book The Business of Scholarly Publishing: Managing in Turbulent Times by Albert N. Greco, Professor of Marketing at Fordham University’s Gabelli School of Business, recently published by Oxford University Press.


Given the current backdrop for all industries, one might say that scholarly communications is in more turmoil than most. With the threat to the commercial model of subscriptions posed by increasing use of Open Access options by authors, as well as the depressed book market and recent closures of university presses, the last thing anyone needs in this particular industry is the increased uncertainty brought about by the coronavirus epidemic.

As such, a book looking back at where the scholarly communications industry has come from and an appraisal of where it is now and how it should pivot to remain relevant in the future would seem like a worthwhile enterprise. Just such a book, The Business of Scholarly Publishing: Managing in Turbulent Times, has recently been written by Albert N. Greco, a U.S. professor of marketing who aims to “turn a critical eye to the product, price, placement, promotion, and costs of scholarly books and journals with a primary emphasis on the trajectory over the last ten years.”

However, in addition to this critical eye, the book needed a more practical look at how the industry has been shaken up over the last 25 years or so. It is difficult to imagine that either an experienced academic librarian or an industry professional advised on the direction of the book, as it has a real blind spot when it comes to some of the major issues impacting the industry today.

The first of these historical misses is a failure to mention Robert Maxwell and his acquisition of Pergamon Press in the early 1950s. Over the next two decades the books and journals publisher saw huge increases in revenues and volumes of titles, establishing a business model of rapid growth using high year-on-year price increases for must-have titles that many argue persists to this day.

The second blind spot is around Open Access (OA). The subject is covered, although not in the detail one would like given its importance to the journal publishing industry in 2020. While one cannot blame the author for missing the still-evolving story around Plan S, Big Deal cancellations and other OA-related developments, one might expect more background on exactly how OA started life, what the first OA journals were, the various declarations around the turn of the Millennium, and how technology enabled OA to become the dominant paradigm in some subject areas.

This misstep may be due to the overall slight bias towards books we find in the text, and indeed the emerging issues around OA books are well covered. There are also extremely comprehensive deep dives into publishing finances and trends since 2000, which mean the book does provide a worthy companion to any academic study of publishing from 2000 to 2016.

And this brings us to the third missing element, which is the lack of appreciation of new entrants and new forms in scholarly publishing. For example, there is no mention of F1000 and post-publication peer review, little on the establishment of preprint servers or institutional repositories, and nothing on OA-only publishers such as Frontiers and Hindawi.

As a result, the book is simply a (very) academic study of some publishing trends in the 2000s and 2010s, and, like much academic research, is both redundant and irrelevant for those practicing in the industry. This is typified in a promising final chapter that seeks to offer “new business strategies in scholarly publishing” by suggesting that short scholarly books, data publishing and library publishing programs should be examined, without acknowledging that all of these already exist.


The Business of Scholarly Publishing: Managing in Turbulent Times, by Albert N. Greco (published April 28, 2020, OUP USA). ISBN: 978-0190626235.

Cabells’ top 7 palpable points about predatory publishing practices

In his latest post, Simon Linacre looks at some new stats collated from the Cabells Predatory Reports database that should help inform and educate researchers, better equipping them to evade the clutches of predatory journals.


In recent weeks Cabells has been delighted to work with both The Economist and Nature Index to highlight some of the major issues for scholarly communication that predatory publishing practices represent. As part of the research for both pieces, a number of facts have been uncovered that not only help us understand the issues inherent in this malpractice much better, but should also point researchers away from some of the sadly typical behaviors we have come to expect.

So, for your perusing pleasure, here are Cabells’ Top 7 Palpable Points about Predatory Publishing Practices:

  1. There are now 13,500 predatory journals listed in the Predatory Reports database, which is currently growing by approximately 2,000 journals a year
  2. Over 4,300 journals claim to publish articles in the medical field (this includes multidisciplinary journals) – that’s a third of the journals in Predatory Reports. By discipline, medical and biological sciences have many more predatory journals than other disciplines
  3. Almost 700 journals in Predatory Reports start with ‘British’ (5.2%), while just 50 do on the Journalytics database (0.4%). Predatory journals often call themselves American, British or European to appear well established and legitimate, when in reality relatively few good quality journals have countries or regions in their titles
  4. There are over 5,300 journals listed in Predatory Reports with an ISSN (40%), although many of these are copied, faked, or simply made up. Having an ISSN is not a guarantee of legitimacy for journals
  5. Around 41% of Predatory Reports journals are based in the US, purport to be from the US, or are suspected of being from the US, based on information on journal websites and Cabells’ investigations. This is the highest count for any country, but only a fraction will really have their base in North America
  6. The average predatory journal publishes about 50 articles a year according to recent research from Bo-Christer Björk of the Hanken School of Economics in Helsinki, less than half the output of a legitimate title. Furthermore, around 60% of papers in such journals receive no future citations, compared with 10% of those in reliable ones
  7. Finally, it is worth noting that while we are in the throes of the Coronavirus pandemic, there are 41 journals listed in Predatory Reports (0.3%) specifically focused on epidemiology and another 35 on virology (0.6% in total). There could be further growth in these numbers over the next 12 months, so researchers in these areas should be particularly careful now about where they submit their papers.