The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the summer months are usually filled with holidays, conferences and a less-than-serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, and earlier this year they compiled what they claimed to be the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020). Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences in collaboration with Anna A. Abalkina, Alexei S. Kassian, Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals with 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.

In addition, the study found that over 1,100 Russian authors had put their names to translated articles published in predatory journals. These included heads of departments at Russian universities, and three authors had over 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project that is developing a database of mainly Russian journals which publish plagiarised articles or violate other criteria of publication ethics. They are concerned that paper mills in Russia that spam authors and offer easy publication in journals are leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend them some legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

A case study of how bad science spreads

Fake news has been the go-to criticism of the media for some politicians, which in turn has been rejected as propaganda and fear-mongering by journalists. However, as former journalist Simon Linacre argues, the fourth estate needs to have its own house in order first, and ensure it is not tripped up by predatory journals.


I class myself as a ‘runner’, but only in the very loosest sense. Inspired by my wife taking up running a few years ago, I decided I should exercise consistently after the numerous half-hearted, unsuccessful attempts I had made over the years. Three years later I have done a couple of half-marathons, run every other day, and track my performance obsessively on Strava. I have also recently started to read articles on running online, and have subscribed to the magazine Runners World. So yes, I think I may actually be a runner now.

But I’m also an experienced journalist, a huge cynic, and have a bulls*** radar the size of the Grand Canyon, so even while relaxing with my magazine I like to think I can spot fakery a mile off. And so it proved earlier this summer while reading a piece on how hill running can improve your fitness. This was music to my ears as someone who lives half-way up a valley side, but my interest was then piqued when I saw a reference to the study that formed the basis for the piece, which was to an article in the International Journal of Scientific Research. Immediately, I smelt a rat. “There is no way that is the name of a reputable, peer-reviewed journal,” I thought. And I was right.

But that wasn’t even half of the problem.

After checking Cabells’ Predatory Reports database, I found that not one but TWO journals with that name are listed on the database, both with long lists of breaches of the Cabells criteria used to identify predatory journals. I was still curious as to the nature of the research, as it could have been legitimate research in an illegitimate journal, or just bad research, full stop. As it turned out, neither journal had ever published any research on hill running and the benefits of increasing VO2 max. So where was the story from?

After some more digging, an article matching the details in the Runners World piece could be found in a third similarly-named journal, the International Journal of Scientific and Research Publications. The article, far from the recent breakthrough suggested in the August 2020 issue of Runners World, was actually published in August 2017 by two authors from Addis Ababa University in Ethiopia. While the science of the article seems OK, the experiment that produced the results involved just 32 people over 12 weeks, which means it really needs further validation with larger samples to confirm its findings. Furthermore, while the journal itself was not included in Cabells’ Predatory Reports database, a review found significant failings, including unusually quick peer review processes and, more seriously, that the “owner/Editor of the journal or publisher falsely claims academic positions or qualifications”. The journal has subsequently been added to Predatory Reports, and the article itself has never been cited in the three years since publication.

Yet one question remains: how did a relatively obscure article, published in a predatory journal and that has never been cited, find its way into a news story in a leading consumer magazine? Interestingly, similar research was also quoted on MSN.com in May 2020 which also quoted the International Journal of Scientific Research, while other sites have also quoted the same research but from the International Journal of Scientific and Research Publications. It appears likely that, having been quoted online once, the same story has been doing the rounds for three years like a game of ‘Telephone,’ all based on uncited research that may not have been peer reviewed in the first place, that used a small sample size and was published in a predatory journal.

While no damage has been done here – underlying all this, it does make sense that hill running can aid overall performance – one need only think about the string of recent health news stories around the coronavirus to see how one unverified article could sweep like wildfire through news outlets and online. This is the danger that predatory journals pose.

No time for rest

This week The Economist published an article on predatory publishing following a collaboration with Cabells. Simon Linacre looks into the findings and points to how a focus on education can avert disaster for COVID-19 and other important research.


One of the consequences of the all-consuming global interest in the coronavirus pandemic is that it has catapulted science and scientists right onto the front pages and into the public’s range of vision. For the most part, this should be a good thing, as there quite rightly has to be a respect and focus on what the facts say about one of the most widespread viruses there has ever been. However, there have been some moments where science itself has been undermined by some of the rather complex structures that support it. And like it or not, scholarly communication is one of them.

Let’s take the perspective of, say, a mother who is worried about the safety of her kids when they go back to school. Understandably, she starts to look online and in the media for what the science says, as many governments have sought to quell people’s fears by saying they are ‘following the science’. But once online, she is faced with a tangled forest of articles, journals, jargon, paywalls and small print, with the media seemingly supporting contradictory statements depending on the newspaper or website she reads. For example, this week’s UK newspapers have led on how the reduction of social distancing from 2m to 1m could double the infection rate, or be many times better than having no social distancing at all – both claims factually accurate and drawn from the same peer-reviewed study in The Lancet.

Another area that has seen a good deal of coverage has been preprints, and how they can speed up the dissemination of science… or have the capability of disseminating false data and findings due to lack of peer review, again depending on where you cast your eye. The concerns represented by media bias, the complexity of information and lack of peer review all combine into one huge problem that could be coming down the line very soon, and that is the prospect of predatory journals publishing erroneous, untested information as research in one of the thousands of predatory journals currently active.

This week Cabells collaborated with The Economist to focus on some of these issues, highlighting that:

  • Around a third of journals on both the Cabells Journal Whitelist and Journal Blacklist focus on health, while predatory journals in subjects such as maths and physics outnumber legitimate journals
  • Geography plays a significant role, with many more English language predatory journals based in India and Nigeria than reliable ones
  • The average output of a predatory journal is 50 articles a year, although 60% of these will never be cited (compared to 10% for legitimate journals)
  • Despite the lack of peer review or any of the usual publishing checks, an estimated 250,000 articles each year are cited in other journals
  • The most common severe behaviors (which are liable to lead to inclusion in the Blacklist) are articles missing from issues or archives, no editor or editorial board listed on the website, and misleading claims about metrics or inclusion in well-known indexes.
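The figures in these bullets roughly hang together arithmetically. As a back-of-envelope sketch – hedged, because the total number of predatory journals is an assumption here (the article only says "thousands"), while the 50-articles-per-year and 60%-uncited figures come from the bullets above:

```python
# Back-of-envelope check: how a ~250,000 cited-articles-per-year figure
# could follow from the per-journal statistics quoted above.

journals = 12_000           # ASSUMPTION: order-of-magnitude count of predatory journals
articles_per_journal = 50   # average annual output per predatory journal (from the bullets)
uncited_share = 0.60        # share of predatory-journal articles never cited (from the bullets)

# Total predatory-journal output per year under these assumptions
total_articles = journals * articles_per_journal

# Articles that do go on to be cited in other journals
cited_articles = total_articles * (1 - uncited_share)

print(f"Estimated articles published per year: {total_articles:,}")
print(f"Estimated articles later cited elsewhere: {cited_articles:,.0f}")
```

With these assumed inputs the sketch gives 600,000 articles published and 240,000 cited per year – the same ballpark as the ~250,000 figure above, which is the point: even a modest citation rate across thousands of low-output journals produces a large absolute number of citations to unchecked work.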

Understandably, The Economist makes the link between so much fake or unchecked science being published and the current coronavirus threat, concluding: “Cabells’ guidelines will only start to catch dodgy studies on COVID-19 once they appear in predatory journals. But the fact that so many “scholars” use such outlets means that working papers on the disease should face extra-thorough scrutiny.” We have been warned.

Doing your homework…and then some

Researchers have always known the value of doing their homework – they are probably the best there is at leaving no stone unturned. But that has to apply to the work itself. Simon Linacre looks at the importance of ‘researching your research’ and using the right sources and resources.


Depending on whether you are a glass-half-full or glass-half-empty kind of person, it is either a great time for promoting the value of scientific research, or science is seeing a crisis of confidence. On the plus side, the value placed on research to lead us out of the COVID-19 pandemic has been substantial, and rarely have scientists been so much to the fore on such an important global issue. On the other hand, there have been uprisings against lockdowns in defiance of science, and numerous cases of fake science related to the coronavirus. Whether it is COVID-19, Brexit, or global warming, we seem to be in an age of wicked problems and polarising opinions on a global scale.

If we assume that our glass is more full than empty in these contrarian times, and try to maintain a sense of optimism, then we should be celebrating researchers and the contribution they make. But that contribution has to be validated in scientific terms, and its publication validated in such a way that users can trust in what it says. For the first part, there has been a good deal of discussion in academic circles and even in the press about the nature of preprints, and how users have to take care to understand that they may not yet have been peer reviewed, so any conclusions should not yet be taken as read.

For the second part, however, there is a concern that researchers in a hurry to publish their research may run afoul of predatory publishers, or simply publish their articles in the wrong way, in the wrong journal, for the wrong reasons. This was highlighted to me when a Cabells customer alerted us to a new website called Academic Accelerator. I will leave people to make their own minds up as to the value of the site; however, a quick test using academic research on accounting (where I managed journals for over a decade, so know the area) showed that:

  • Attempting to use the ‘Journal Writer’ function for an accounting article suggested published examples from STM journals
  • Trying to use the ‘Journal Matcher’ function for an accounting article again only recommended half a dozen STM journals as a suitable destination for my research
  • Data for individual journals appears to have been crowdsourced from users, and didn’t match the actual data for many journals in the discipline.

The need for researchers to publish as quickly as possible has perhaps never been greater, and the tools and options for them to do so have arguably never been as open. However, with this comes a gap in the market that many operators may choose to exploit, and at this point, the advice for researchers is the same as ever. Always research your research – know what you are publishing and where you are publishing it, and what the impact will be both in scholarly terms and in real-world terms. In an era where working from home is the norm, there is no excuse for researchers not to do their homework on what they publish.


***REMINDER***

If you haven’t already completed our survey, there is still time to provide your opinion. Cabells is undertaking a review of the current branding for ‘The Journal Whitelist’ and ‘The Journal Blacklist’. As part of this process, we’d like to gather feedback from the research community to understand how you view these products, and which of the proposed brand names you prefer.

Our short survey should take no more than ten minutes to complete, and can be taken here.

As thanks for your time, you’ll have the option to enter into a draw to win one of four Amazon gift vouchers worth $25 (or your local equivalent). More information is available in the survey.

Many thanks in advance for your valuable feedback!

Simon Linacre

Unintended consequences: how will COVID-19 shape the future of research?

What will happen to global research output during lockdowns as a result of the coronavirus?  Simon Linacre looks at how the effect in different countries and disciplines could shape the future of research and scholarly publications.


We all have a cabin fever story now after many countries have entered into varying states of lockdown. Mine is how the little things have lifted what has been quite an oppressive mood – the smell of buns baking in the oven; lying in bed that little bit longer in a morning; noticing the newly born lambs that have suddenly appeared in nearby fields. All of these would be missed during the usual helter-skelter days we experience during the week. But things are very far from usual in these coronavirus-infected days. And any distraction is a welcome one.

On a wider scale, the jury is still very much out as to how researchers are dealing with the situation, let alone how things will be affected in the future. What we do know is that in those developed countries most impacted by the virus, universities have been closed down, students sent home and labs mothballed. In some countries such as Italy there are fears important research work could be lost in the shutdown, while in the US there is concern for the welfare of those people – and animals – who are currently in the middle of clinical trials. Overall, everyone hopes that the specific research into the coronavirus yields some quick results.

On the flip side, however, for those researchers not confined to labs or field research, this period could accelerate their work. For those in social science or humanities freed from the commute, teaching commitments and office politics of daily academic life, the additional time will no doubt be put to good use. More time to set up surveys; more time for reading; more time for writing papers. Increased research output is perhaps inevitable in those areas where academics are not tied to labs or other physical experiments.

These two countervailing factors may cancel each other out, or one may prevail over the other. As such, the scholarly publishing community does not yet know what to expect down the line. In the short term, it has been focused on making related content freely accessible (such as this site from The Lancet). However, what we may see is greater pressure for research in potentially globally important areas to be made open access at the source, given how well researchers and networks appear to have worked together during the short time the virus has been at large.

Again, unintended consequences could be one of the key legacies of the crisis once the virus has died down. Organizations concerned about how their people can work from home will no doubt have their fears allayed, while the positive environmental impact of less travelling will be difficult to give up. For publishers and scholars, understanding how their research could have an impact when the world is in crisis may change their research aims forever.

The future of research evaluation

Following last week’s guest post from Rick Anderson on the risks of predatory journals, we turn our attention this week to legitimate journals and the wider issue of evaluating scholars based on their publications. With this in mind, Simon Linacre recommends a broad-based approach that keeps the goal of such activities permanently front and center.


This post was meant to be ‘coming to you LIVE from London Book Fair’, but as you may know, this event has been canceled, like so many other conferences and other public gatherings in the wake of the coronavirus outbreak. While it is sad to miss the LBF event, meetings will take place virtually or in other places, and it is to be hoped the organizers can bring it back bigger and better than ever in 2021.

Some events are still going ahead, however, in the UK, and it was my pleasure to attend the LIS-Bibliometrics Conference at the University of Leeds last week to hear the latest thinking on journal metrics and performance management for universities. The day-long event was themed ‘The Future of Research Evaluation’, and it included both longer talks from key people in the industry, and shorter ‘lightning talks’ from those implementing evaluation systems or researching their effectiveness in different ways.

There was a good deal of debate, both on the floor and on Twitter (see #LisBib20 to get a flavor), with perhaps the most interest in speaker Dr. Stephen Hill, who is Director of Research at Research England and chair of the steering group for the 2021 Research Excellence Framework (REF) in the UK. Those of us wishing to see crumbs from his table in the shape of a steer for the next REF were sadly disappointed, as he was giving nothing away. However, what he did say was that he saw four current trends shaping the future of research evaluation:

  • Outputs: increasingly they will be diverse, include things like software code, be more open, more collaborative, more granular and potentially interactive rather than ‘finished’
  • Insight: different ways of understanding may come into play, such as indices measuring interdisciplinarity
  • Culture: the context of research and how it is received in different communities could become explored much more
  • AI: artificial intelligence will become a bigger player both in terms of the research itself and how the research is analyzed, e.g. the Unsilo tools or so-called ‘robot reviewers’ that can remove any reviewer bias.

Rather revealingly, Dr. Hill suggested a fifth trend might be societal impact, despite the fact that such impact has been one of the defining traits of both the current and previous REFs. Perhaps the full picture has yet to be understood regarding impact, and there is some suspicion that many academics have yet to buy in to the idea at all. Indeed, one of the takeaways from the day was that there was little input into the discussion from academics themselves, and one wonders if they might have contributed to the debate about the future of research evaluation – it is their research being evaluated, after all.

There was also a degree of distrust among the librarians present towards publishers, and one delegate poll should be of particular interest to them as it showed what those present thought were the future threats and challenges to research evaluation. The top three threats were identified as publishers monopolizing the area, commercial ownership of evaluation data, and vendor lock-in – a result which led to a lively debate around innovation and how solutions could be developed if there was no commercial incentive in place.

It could be argued that while the UK has taken the lead on impact and been busy experimenting with the concept, the rest of the higher education world has been catching up with a number of different takes on how to recognize and reward research that has a demonstrable benefit. All this means that we are yet to see the full ‘impact of impact,’ and certainly, we at Cabells are looking carefully at what potential new metrics could aid this transition. Someone at the event said that bibliometrics should be “transparent, robust and reproducible,” and this sounds like a good guiding principle for whatever research is being evaluated.

Will academia lead the way?

Universities are usually expected to have all the answers – they are full of clever people after all. But sometimes, they need some help to figure out specific problems. Simon Linacre attended a conference recently where the questions being asked of higher education are no less than solving the problems of climate change, poverty, clean water supply and over a dozen more similar issues. How can academic institutions respond?


Most people will be aware of the United Nations’ Sustainable Development Goals (SDGs), adopted to solve 17 of the world’s biggest problems by 2030. Solving the climate change crisis by that date has perhaps attracted the most attention, but all of the goals present significant challenges to global society.

Universities are very much at the heart of this debate, and there seems to be an expectation that because of the position they have in facilitating research, they will unlock the key to solving these major problems. And so far they seem to have taken up the challenge with some gusto, with new research centers and funding opportunities appearing all the time for those academics aiming to contribute to these global targets in some way. What seems to be missing, however, is that many academics don’t seem to have received the memo on what they should be researching.
 
Following several conversations at conferences and with senior management at a number of universities, the two themes that keep being repeated when it comes to existing research programs are problems with ‘culture’ and ‘capabilities’. By culture, university hierarchies report that their faculty members are still as curious and keen to do research as ever, but they are less interested when they are told to focus their energies on certain topics. And when they do, they lack the motivation or incentives to ensure the outcomes of their research lie in real-world impact. For the academic, impact still means a smallish number with three decimal places – i.e., the Impact Factor.

In addition, when it comes to the capability of undertaking the kind of research that is likely to contribute to moving forward the SDGs, academics have not had any training, guidance, or support in what to do. In the UK, for example, where understanding and exhibiting impact is further forward than anywhere else in the world thanks to the Research Excellence Framework (REF), there still seem to be major issues with academics being focused on research that will get published rather than research that will change things. In one conversation, while I was referring to research outcomes as real-world benefits, an academic was talking about the quality of journals in which research would be published. Both are legitimate research outcomes, but publication is still way ahead in terms of cultural expectations. And internal incentives are in reality far behind the overarching aims stated by governments and research organizations.

Perhaps we are being too optimistic to expect the grinding gears of academia to move more smoothly towards a major culture change, and perhaps the small gains that are being made and the work done in the public space by the likes of Greta Thunberg will ultimately be enough to enable real change. But when the stakes are so high and the benefits are so great, maybe our expectations should weigh heavily on academia, as they are probably the people best placed to solve the world’s problems after all.

GBSN: Measuring the Impact of Business Schools

Business schools and the MBAs they teach have been reinvented on a regular basis almost since they began life early in the 20th century. However, Simon Linacre suggests that as the Global Business School Network meets for its annual conference in Lisbon this week, calls for a new approach might just be followed through.


Another day, another business school conference. As a veteran of at least a dozen or so such events, it is hard not to be a little cynical when reading about the conference theme set out on the website. Business schools need to change? Check. New programs being promoted? Check. Social running club at 7am on the first morning? Oh, that’s actually quite different.

The Global Business School Network (GBSN) is quite different. With a mission to “partner with business schools, industry, foundations, and aid agencies to improve access to quality, locally relevant management education for the developing world”, its focus is very much on a sustainable future rather than on shiny new MBAs for the privileged few who can afford them. As such, the theme of ‘Measuring the Impact of Business Schools’ is more than simply an on-trend marketing slogan; it is a statement of purpose.

But despite its undoubted sincerity, can such an objective be achieved? The reason it just might is that it is very much aligned with a changing mood in business education. A recent report in The Economist referred to the development of a ‘New Capitalism’, where greed is no longer good and sustainability the imperative rather than simply growth. Evidence can be seen not just in the numerous business school deans being quoted in the piece, but in wider moves such as the New Zealand Prime Minister Jacinda Ardern’s recent pivot to adopt the Happiness Index as a metric for national development. The times they are a-changin’, as someone once said.

Ultimately, such changes may be as much to do with the bottom line rather than more altruistic motives. Recruitment in the US to MBAs is down, with students apparently becoming more demanding when it comes to what is being taught, with a focus on sustainability and wider impact at the top of the list. The mantra ‘doing well by doing good’ is not a new one, but perhaps we are entering an era where that shifts from just another strapline to becoming a true aphorism for change.

Cabells is supporting the GBSN event by hosting a session on Research Impact for the Developing World. There are no preconceived ideas or solutions, just that the existing notions of impact are changing, and that each school needs to be laser-focused on investing in impact in the most relevant way for its own mission and purpose. Whatever business schools can therefore learn about measuring their impact will mean that the conference’s theme actually means something for once.

From Lisbon to Charleston, Cabells has you covered

This week, Cabells is fortunate enough to connect with colleagues and friends, new and old, across the globe in Lisbon, Portugal at the GBSN 2019 Annual Conference, and in Charleston, South Carolina at the annual Charleston Conference. We relish these opportunities to share our experiences and learn from others, and both conference agendas feature industry leaders hosting impactful sessions covering myriad thought-provoking topics. 

At the GBSN conference in Lisbon, Simon Linacre, Cabells Director of International Marketing and Development, is co-leading the workshop, “Research Impact for the Developing World” which explores ideas to make research more impactful and relevant in local contexts. At the heart of the matter is the notion that unless the global business community is more thoughtful and proactive about the development of research models, opportunities for positively impacting business and management in the growth markets of the future will be lost. We know all in attendance will benefit from Simon’s insights and leadership in working through this important topic.


At the Charleston Conference, a lively and eventful day at the vendor showcase on Tuesday was enjoyed by all and our team was reminded once again how wonderful it is to be a part of the scholarly community. We never take for granted how fortunate we are to have the opportunity to share, learn, and laugh with fellow attendees. 

 

We are always excited to pass along news on the projects we are working on, learn about what we might expect down the road, and consider areas we should focus on going forward. Hearing what is on the collective mind of academia and how we can help move the community forward is what keeps us going. And things are just getting started! With so many important and interesting sessions on the agenda in Charleston, our only regret is that we can’t attend them all!

Cabells renews partnership with CLOCKSS to further shared goals of supporting scholarly research

Cabells is excited to announce the renewal of its partnership with CLOCKSS, the decentralized preservation archive that ensures the long-term survival of scholarly content in digital format. Cabells is pleased to provide complimentary access to the Journal Whitelist and Journal Blacklist databases for an additional two years to CLOCKSS, to further the organizations’ shared goals of supporting and preserving scholarly publications for the benefit of the global research community.

The goal of Cabells is to provide academics with the intelligent data needed for comprehensive journal evaluations to safeguard scholarly communication and advance the dissemination of high-value research.  Assisting CLOCKSS in its mission to provide secure and sustainable archives for the preservation of academic publications in their original format is a logical and rewarding collaboration.

“We are proud to renew our partnership with CLOCKSS. Our mission to protect the integrity of scholarly communication goes hand in hand with their work to ensure the secure and lasting preservation of scholarly research,” said Kathleen Berryman, Director of Business Relations with Cabells.

In helping to protect and preserve academic research, Cabells and CLOCKSS are fortunate to play vital roles in the continued prosperity of the scholarly community.


 

About: Cabells – Since its founding over 40 years ago, Cabells services have grown to include both the Journal Whitelist and the Journal Blacklist, manuscript preparation tools, and a suite of powerful metrics designed to help users find the right journals, no matter the stage of their career. The searchable Journal Whitelist database includes 18 academic disciplines from more than 11,000 international scholarly publications. The Journal Blacklist is the only searchable database of predatory journals, complete with detailed violation reports. Through continued partnerships with major academic publishers, journal editors, scholarly societies, accreditation agencies, and other independent databases, Cabells provides accurate, up-to-date information about academic journals to more than 750 universities worldwide. To learn more, visit www.cabells.com.

About: CLOCKSS is a not-for-profit joint venture between the world’s leading academic publishers and research libraries whose mission is to build a sustainable, international, and geographically distributed dark archive with which to ensure the long-term survival of Web-based scholarly publications for the benefit of the greater global research community.