What really counts for rankings?

University and business school rankings have induced hate and ridicule in equal measure since they were first developed, and yet, we are told, they enjoy huge popularity with students. Simon Linacre looks at how the status quo could change, thanks in part to some rankings’ own shortcomings.


In a story earlier this month in Times Higher Education (THE), it was reported that the status of a university vis-à-vis sustainability was now the primary consideration for international students, ahead of academic reputation, location, job prospects and even accessibility for Uber Eats deliveries – OK, maybe not the last one. But for those who think students should place such considerations at the top of their lists, this was indeed one of those rare things in higher ed in recent times: a good news story.

But how do students choose such a university? Amazingly, THE produced a ranking just a week later providing students with, you guessed it, a ranking of universities based on their sustainability credentials. Aligned with the UN’s now-ubiquitous Sustainable Development Goals (SDGs), the ranking is now well established, and this year proclaimed the University of Manchester in the UK as the university with the highest impact across all 17 SDGs, although it was something of an outlier for the UK, with four of the top ten universities based in Australia.

Cynics may point out that such rankings have become an essential part of the marketing mix for outfits such as THE, the Financial Times and QS. Indeed, the latter has faced allegations this week over possible conflicts of interest between its consulting arm and its rankings with regard to universities in Russia – a charge which QS denies. However, perhaps most concerning is the imbalance that has always existed between the importance institutions place on rankings and the transparency and/or relevance of the rankings themselves. A perpetual case of the tail wagging the dog.

Take, for instance, the list of 50 journals used by the Financial Times as the basis for one of its numerous criteria for assessing business schools in its annual rankings. The list is currently under review, having not changed since 2016, and even then it only added five journals to the 45 used before that date, itself an expansion of the 40 used in the 2000s. In other words, despite the massive changes seen in business and business education – from Enron to the global financial crisis to globalisation to the COVID pandemic – there has been barely any change in the journals used to assess whether business schools’ publications are high quality.

The FT’s Global Education Editor Andrew Jack was questioned about the relevance of the FT50 and the rankings in general in Davos in 2020, and answered that changing the criteria would endanger the comparability of the rankings. This intransigence by the FT and other actors in higher education and scholarly communications was part of the motivation behind Cabells’ pilot study with the Haub School of Business at Saint Joseph’s University in the US to create a new rating based on journals’ output intensity in terms of the SDGs. Maintaining the status quo also reinforces existing paradigms and restricts diversity, marginalizing those in vulnerable and alternative environments.

If students and authors want information on SDGs and sustainability to inform their education choices, it is incumbent on the industry to try to supply it in as many ways as possible – and not to worry about how well the numbers stack up against a world we left behind long ago, a world that some agencies seem to want to cling to despite its evident shortcomings.

Opening up the SDGs

While the United Nations Sustainable Development Goals (SDGs) offer a framework for global communities to tackle the world’s biggest challenges, there are still huge barriers to overcome in ensuring research follows the desired path. This week, Simon Linacre reflects on the ‘push’ and ‘pull’ effects in publishing and one organization trying to refine a fragmented infrastructure.

Recently, Cabells has been able to further its commitment to the UN SDGs by signing up to the SDG Publishers Compact and sharing details of its pilot journal rating system, developed with the Haub School of Business at Saint Joseph’s University, that assesses journals in terms of their relevance to the SDGs. Part of the reason Cabells is working with the SDGs – aside from a simple belief that they are a force for good – is that they represent an opportunity to offer reward and recognition to researchers who are using their talents to make the world, in some small way, a better place.

The scholarly communications industry, like many others, relies on push and pull dynamics to maintain its growth trajectory. The push elements include the availability of citations and other metrics to judge performance, recognition for publishing in certain journals, and various community rewards for well-received research. On the flip side, pull elements include opportunities shared by academic publishers, a facility to record research achievements, and an opportunity to share findings globally. This is how the publishing world goes round.

This dynamic also helps to explain why potentially disruptive developments – such as Open Access or non-peer-reviewed journals and platforms – may fail to gain initial traction, and why they may require additional support in order to become embedded with academics and their mode of operations. Going back to the SDGs, we can see how their emergence could similarly be stymied by the existing power play in scholarly publishing – where are the push and pull factors guiding researchers to focus on SDG-related subjects?

I recently spoke to Stephanie Dawson, CEO at ScienceOpen, which is a discovery platform that seeks to enable academics to enhance their research in an open access environment and offer publishers ‘context building services’ to improve the impact of their outputs. ScienceOpen is very much involved with the UN SDGs, recently creating a number of content containers for SDG-related articles. By offering curative opportunities, post-publication enhancements, and article-level data services, ScienceOpen is most definitely doing its part to support a pull strategy in the industry.

Stephanie says, “We began this project working with the University College London (UCL) Library to showcase their outputs around the UN SDGs. Because we believe there needs to be broad community buy-in, we also wanted to encourage researchers globally to highlight their contributions to the Sustainable Development Goals by adding keywords and author summaries on ScienceOpen, regardless of the journal they published in and demanding publisher engagement for new works.”

And this is what Cabells is also trying to achieve – by offering new metrics that can be used to guide authors to the optimal publishing option (push) and highlighting traditionally overlooked journals with low citations as destination publications (pull), we hope we can change the conversation from ‘Is this a good journal?’ to ‘Does this research matter?’. And we think reframing the context like ScienceOpen is doing is an important first step.

Cabells launches new SDG Impact Intensity™ journal rating system in partnership with Saint Joseph’s University’s Haub School of Business

Following hot on the heels of Cabells’ inclusion in the United Nations SDG Publishers Compact, we are also announcing an exclusive partnership with Saint Joseph’s University (SJU) for a new metric assessing journals and their engagement with the UN’s Sustainable Development Goals (SDGs). Simon Linacre explains the origins of the collaboration and how the new metric could help researchers, funders, and universities alike.

If you can remember back to the halcyon days when we went to academic conferences, you will remember that one of the many benefits we enjoyed was meeting a kindred spirit – someone who shared your thoughts and ideas and whom you looked forward to seeing again at another event. These international friendships also had the benefit of enabling you to develop something meaningful with your work, and went some way to justifying the time and expense the trips often entailed.

I was lucky enough to have one such encounter at the GBSN annual conference in Lisbon, Portugal at the end of 2019, when I met Professor David Steingard from Saint Joseph’s University in the US. He was at the event to present some of the work he had been doing at SJU on its SDG Dashboard – an interactive visualization and data analytics tool demonstrating how university programmes align with the 17 SDGs. At the gala dinner I sought Dr. Steingard out and asked him something that had been buzzing inside my head ever since I heard him speak:

What if we applied your SDG reporting methodology to journals?

An animated conversation followed, which continued on the bus back to the hotel, at the conference the next day, and ultimately in the lobby of a swanky hotel in Davos (there are no other kinds of hotel there, to be honest) a year ago. Since then, small teams at SJU and Cabells have been working on a methodology for analysing and assessing the extent to which a journal has engaged with the UN’s SDGs through the articles it has published over time. This has resulted in the new metric we are releasing shortly – SDG Impact Intensity™ – the first academic journal rating system for evaluating how journals contribute to positive impact on the SDGs.

Using data collated from Cabells’ Journalytics database and running it through SJU’s AI-based methodology for identifying SDG relevance, SDG Impact Intensity™ provides a rating of up to three ‘SDG rings’ to summarise the SDG relevance of articles published in a journal over a five-year period (2016-2020). For the first pilot phase of development, we chose 50 of the most storied business and management journals used for the Financial Times Global MBA ranking, as well as 50 of the most dynamic journals from Cabells’ Journalytics database focused on sustainability, ethics, public policy and environmental management.
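For illustration only, here is a minimal sketch of how such a ring rating could be computed: classify each article for SDG relevance, then map the journal’s five-year share of SDG-related articles to a 0-3 ring score. The classifier and the thresholds below are hypothetical assumptions, not the actual SJU/Cabells methodology, which has not been published in detail.

```python
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    abstract: str
    year: int

def is_sdg_relevant(article: Article) -> bool:
    """Placeholder for the AI-based relevance classifier (assumed)."""
    keywords = ("sustainability", "poverty", "climate", "equality", "clean energy")
    text = (article.title + " " + article.abstract).lower()
    return any(k in text for k in keywords)

def sdg_rings(articles: list[Article], start: int = 2016, end: int = 2020) -> int:
    """Map the share of SDG-relevant articles in the window to 0-3 rings."""
    window = [a for a in articles if start <= a.year <= end]
    if not window:
        return 0
    share = sum(is_sdg_relevant(a) for a in window) / len(window)
    # Illustrative cut-offs only.
    if share >= 0.6:
        return 3
    if share >= 0.3:
        return 2
    if share >= 0.1:
        return 1
    return 0
```

The design point is simply that the rating is journal-level but derived from article-level classification, so a journal cannot score well on the strength of its title or aims alone.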

It may come as no surprise to learn that the so-called top journals lagged way behind their counterparts when it came to their level of SDG focus. For example, none of the top 26 journals in the pilot phase are from the FT50, and only four of the top ten are from the world’s five biggest academic publishers. In contrast, the journals traditionally ranked at the very top of management journal rankings over the past 50 years in disciplines such as marketing, accounting, finance and management languish at the bottom of the pilot phase ratings. While these results are hardly surprising, they show that although governments, funders and society as a whole have started to embrace the SDGs, this has yet to filter through to what is published in journals traditionally regarded as high impact. There has long been criticism that such titles are favoured by business school management structures over more innovative, real-world-relevant journals, and this very much seems to be borne out by the results of Cabells’ research with SJU. The very notion of what academic journal “quality” means is fundamentally challenged when one considers how journals can make an “impact” by engaging with the SDGs.

Cabells and SJU hope to further their partnership and broaden their coverage of journals to enable more researchers and other interested parties to understand the type of research their target journals are publishing. With more information and a greater understanding of the SDGs at hand, it is to be hoped we will see a move away from a narrow focus on traditional quality metrics towards broader encouragement of research and publication that has a positive impact on bettering the human condition and environmentally sustaining the Earth, as detailed in the SDGs. In turn, we should see academia and scholarly communications play their part in ensuring the UN’s 2030 Agenda for Sustainable Development moves forward that much more quickly.

Cabells and scite partner to bring Smart Citations to Journalytics

Cabells, a provider of key intelligence on academic journals for research professionals, and scite, a platform for discovering and evaluating scientific articles, are excited to announce the addition of scite’s Smart Citations to Cabells Journalytics publication summaries.

[Image: Journalytics summary card with scite Smart Citations data]

Journalytics is a curated database of over 11,000 verified academic journals spanning 18 disciplines, developed to help researchers and institutions optimize decision-making around the publication of research. Journalytics summaries provide publication and submission information and citation-backed data and analytics for comprehensive evaluations.

scite’s Smart Citations allow researchers to see how articles have been cited by providing the context of the citation and a classification describing whether it provides supporting or disputing evidence for the cited claim.

The inclusion of Smart Citations adds a layer of perspective to Journalytics metrics and gives users a deeper understanding of journal activity by transforming citations from a mere number into contextual data.
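As a rough illustration of what “contextual data” means here, the sketch below tallies citation classifications for a set of citations to an article. The field names and class labels are assumptions for illustration only, not scite’s actual schema or API.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class SmartCitation:
    citing_doi: str
    cited_doi: str
    snippet: str          # the sentence in which the citation appears
    classification: str   # assumed labels: "supporting", "disputing", "mentioning"

def tally_by_classification(citations: list[SmartCitation]) -> Counter:
    """Turn a raw citation count into contextual counts per class."""
    return Counter(c.classification for c in citations)

citations = [
    SmartCitation("10.1000/a", "10.1000/x", "...confirms earlier findings...", "supporting"),
    SmartCitation("10.1000/b", "10.1000/x", "...could not replicate...", "disputing"),
    SmartCitation("10.1000/c", "10.1000/x", "...as discussed in...", "mentioning"),
]
print(tally_by_classification(citations))
# Counter({'supporting': 1, 'disputing': 1, 'mentioning': 1})
```

The same three citations that would count identically in a simple citation total carry very different signals once classified – which is the point of surfacing the context alongside the count.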

Lacey Earle, executive director of Cabells, says, “Cabells is thrilled to partner with scite in order to help researchers evaluate scientific articles through an innovative, comparative-based metric system that encourages rigorous and in-depth research.”

Josh Nicholson, co-founder and CEO of scite says of the partnership, “We’re excited to be working with Cabells to embed our Smart Citations into their Journalytics summaries. Smart Citations help you assess the quantity of citations a journal has received as well as the quality of these citations, with a focus on identifying supporting and disputing citations in the literature.”


about cabells

Cabells generates actionable intelligence on academic journals for research professionals. On the Journalytics platform, an independent, curated database of more than 11,000 verified scholarly journals, researchers draw from the intersection of expertise, data, and analytics to make confident decisions to better administer research. In Predatory Reports, Cabells has undertaken the most comprehensive and detailed campaign against predatory journals, currently reporting on deceptive behaviors of over 14,000 publications. By combining its efforts with those of researchers, academic publishers, industry organizations, and other service providers, Cabells works to create a safe, transparent and equitable publishing ecosystem that can nurture generations of knowledge and innovation. For more information please visit Cabells or follow us on Twitter, LinkedIn and Facebook.

about scite

scite is a Brooklyn-based startup that helps researchers better discover and evaluate scientific articles through Smart Citations – citations that display the context of the citation and describe whether the article provides supporting or disputing evidence. scite is used by researchers from dozens of countries and is funded in part by the National Science Foundation and the National Institute on Drug Abuse of the National Institutes of Health. For more information, please visit scite, follow us on Twitter, LinkedIn, and Facebook, and download our Chrome or Firefox plugin. For careers, please see our jobs page.

The fake factor

On the day the US says goodbye to its controversial President, we cannot bid farewell to one of his lasting legacies: highlighting the issues of fake news and misinformation. Simon Linacre looks at how putting the issue in the spotlight could at least increase people’s awareness… and asks for readers’ help in doing so.

Cabells completed around a dozen webinars with Indian universities towards the end of 2020, both to share some of our knowledge of predatory publishing and to learn from librarians, faculty members and students about their experiences. Studies have shown that India both hosts the highest number of predatory journals and has the most authors publishing in them, as well as a government as committed as any to dealing with the problem, so any insight from the region is extremely valuable.

Q&A sessions following the webinars were especially rich, with a huge range of queries and concerns raised. One specific query raised a number of issues: how can researchers know whether the index a journal claims to be listed in is legitimate or not? As some people will be aware, one of the tricks of the trade for predatory publishers is to promote the indices their journals are (supposedly) listed in, and these claims come in several types (a simple verification sketch follows the list):

  • Pure lies: These are journals that say they have an ‘Impact Factor’ but are not listed by Clarivate Analytics in its Master Journal List of titles indexed on Web of Science (and therefore do not have an Impact Factor, unless only recently accepted)
  • Creative lies: These journals say they are listed by an index, which is true, but the index is little more than a list of journals that say they are listed by the index, with the words ‘Impact Factor’ added to make it sound better (e.g. ‘Global Impact Factor’, ‘Scholarly Article Impact Factor’)
  • Nonsensical lies: These are links (or usually just images) to seemingly random words or universities that try to impart some semblance of recognition, but mean nothing. For example, it may be the name of a list, service or institution, but a quick search finds nothing relating those names to the journal
  • White lies: One of the most common – many predatory journals say they are ‘listed’ or ‘indexed’ by Google Scholar. While it is true that these journals can be discovered via Google Scholar, they are not listed or indexed there, for the simple reason that Google Scholar is not a list or an index
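To make the first of those checks concrete, here is a minimal sketch of verifying a claimed listing against a vetted source, assuming a locally downloaded CSV export of a master journal list with a “title” column. The file name and column name are illustrative assumptions, not a real export format.

```python
import csv

def claimed_listing_is_verifiable(journal_title: str, master_list_csv: str) -> bool:
    """Return True if the journal title appears in the vetted list export."""
    wanted = journal_title.strip().lower()
    with open(master_list_csv, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            if row.get("title", "").strip().lower() == wanted:
                return True
    return False

# Usage (hypothetical file downloaded from the index's official site):
# claimed_listing_is_verifiable("Journal of Important Results", "master_journal_list.csv")
```

The principle matters more than the code: always verify a listing claim at the index’s own official source, never on the journal’s website.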

When Jeffrey Beall was active, he included a list of ‘Misleading Metrics’ on his blog that highlighted some of these issues. A version or versions of this can still be found today, but they are not linked to here because (a) they are out of date by at least four years, and (b) the term ‘misleading’ is, well, misleading, as few of the indexes include metrics in the first place, and the metrics may not be the major problem with the index. However, this information is very valuable, and as such Cabells has begun its own research program in 2021 to create an objective, independently verifiable and freely available list of fake indexes. And, what’s more, we need your help – if anyone would like to suggest a suspicious-looking journal index for us to look into, please write to me at simon.linacre@cabells.com and we will review the site for inclusion.

Back to basics

As we enter what is an uncertain 2021 for many both personally and professionally, it is worth perhaps taking the opportunity to reset and refocus on what matters most to us. In his latest blog post, Simon Linacre reflects on Cabells’ new video and how it endeavors to show what makes us tick.

It is one of the ironies of modern life that we seem to take comfort in ‘doomscrolling’, that addictive pastime of flicking through Twitter or other social media on the hunt for the next scandal to inflame our ire. Whether it is Brexit, the coronavirus pandemic or alleged election shenanigans, we can’t seem to get enough of the tolls of doom ringing out in our collective echo chambers. As the New Year dawns with little good news to cheer us, we may as well go all in as the world goes to hell in a handcart.

Of course, we also like the lighter moments that social media provide, such as cat videos and epic fails. And it is comforting to hear stories that renew our faith in humanity. One parent on Twitter remarked this week, as the UK’s schools closed and reverted to online learning, that she was so proud of her child who, on hearing the news, immediately started blowing up an exercise ball, resolving not to waste the opportunity lockdown provided to get fit.

Reminding ourselves that the glass can be at least half full even if it looks completely empty is definitely a worthwhile exercise, even if it feels like the effort of constantly refilling it is totally overwhelming. At Cabells, our source of optimism has recently come from the launch of our new video. The aim of the video is to go back to basics and explain what Cabells does, why it does it, and how it does it through its two main products – Journalytics and Predatory Reports.

Making the video was a lot of fun, on what was a beautiful sunny Spring day in Edinburgh with one of my US colleagues at an academic conference (remember them?). While nerve-shredding and embarrassing, it was also good to go back to basics and underline why Cabells exists and what we hope to achieve through all the work we do auditing thousands of journals every year.

It also acted as a reminder that there is much to look forward to in 2021 that will keep our glasses at least half full most of the time. Cabells will launch its new medical journal database early this year, which will see over 5,000 medical journals indexed alongside the 11,000 journals already indexed in Journalytics. We also have major upgrades and enhancements planned for both the Journalytics and Predatory Reports databases that will help researchers, librarians and funders better analyse journal publishing activity. So, let’s raise a (half full) glass to the New Year, and focus on the light at the end of the tunnel rather than the darkness that seems to surround us in early January.

Empowering India’s Academia

According to some research, India has the unfortunate distinction of having both the highest number of predatory journals based there, and the highest number of authors publishing in them. In this week’s blog, Simon Linacre answers some of the questions researchers in that country have regarding authentic publishing practices.

During the latter part of 2020, instead of jetting off to typical destinations in the scholarly communications calendar such as Frankfurt and Charleston, some of my energies were focused on delivering a number of short webinars on predatory publishing to a variety of Indian institutions. According to an oft-quoted article by Shen and Björk in 2015, India has the largest number of authors who have published in predatory journals, and Cabells knows from its own data that India is one of the most common countries in which predatory journals originate.

There are probably a number of reasons for this, but rather than speculate it is perhaps better to act and support Indian authors who do not want to fall into the numerous traps laid for them. One aspect of this is the recent activity of the University Grants Commission (UGC) and its Consortium for Academic and Research Ethics (CARE), which have produced a list of recommended journals for Indian scholars to use.

However, many in India are still getting caught out, as some journals have been cloned or hijacked, while others use the predatory journal tactic of simply lying about being listed by UGC-CARE, Scopus, Web of Science or Cabells in order to attract authors. Following one webinar last week at an Indian National Institute of Technology (NIT), there was a flurry of questions from participants that deserved more time than was available, so I thought it would be worth sharing some of these questions and my answers here so others can pick up a few tips when making that crucial decision about where to publish their research.

  1. What’s the difference between an Impact Factor and CiteScore? They look and feel similar, but the Impact Factor (IF) counts how a journal’s articles published over a two-year period have been cited the following year in other Web of Science-indexed journals, whereas the CiteScore counts citations the following year to the documents a journal published over a three-year period (the two formulas are written out after this list).
  2. How do I know if an Impact Factor is fake? Until recently, this was tricky, but now Clarivate Analytics has made its IFs for the journals it indexes for the previous year available on its Master Journal List.
  3. If I publish an article in a predatory journal, can the paper be removed? A submission can be made to the publisher for an article to be retracted, however a predatory publisher is very unlikely to accede to the request and will probably just ignore it. If they do respond, they have been known to ask for more money – in addition to the APC that has already been paid – to carry out the request, effectively blackmailing the author.
  4. If I publish an article in a predatory journal, can I republish it elsewhere? Sadly, dual publication is a breach of publication ethics whether the original article is published in a predatory journal or a legitimate one. The best course of action is to build on the original article with further research so that it is substantially different from the original and publish the subsequent paper in a legitimate journal.
  5. How can I tell if a journal is from India or not? If the origin of the journal is important, the information should be available on the journal’s website. It can also be checked using other sources, such as Scimago which provides country-specific data on journals.
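For readers who want the arithmetic behind point 1, the two metrics as described there can be written out as follows (a sketch of the definitions as given in that answer, with Y denoting the metric year; note that the metric owners adjust methodological details over time):

```latex
% Impact Factor for year Y (two-year publication window, WoS citations):
\mathrm{IF}_Y = \frac{\text{citations received in } Y \text{ by items published in } Y-1 \text{ and } Y-2}
                     {\text{citable items published in } Y-1 \text{ and } Y-2}

% CiteScore for year Y as described above (three-year publication window):
\mathrm{CiteScore}_Y = \frac{\text{citations received in } Y \text{ by documents published in } Y-3 \text{ to } Y-1}
                            {\text{documents published in } Y-3 \text{ to } Y-1}
```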

Simon Linacre, Cabells

The RAS Commission for Counteracting the Falsification of Scientific Research

Predatory publishing is undoubtedly a global phenomenon, but with unique characteristics in different countries. In this week’s blog, Simon Linacre shares insight from Russia and a group of researchers keen to shine the spotlight on breaches in publication ethics from within their own country.

For many of us, the Summer months are usually filled with holidays, conferences and a less than serious news agenda. The so-called ‘silly season’ could reliably be identified by news stories of the cat-stuck-in-tree variety, signaling to us all that there was nothing going on and we could safely ignore the news and concentrate on more important things, such as what cocktail to order next from the poolside bar.

Not anymore.

It is hard to put a finger on it, but since around 2015 the Summer months seem to have been populated almost exclusively by epoch-making events from which it is impossible to escape. Whether it’s Brexit, COVID or any number of natural or man-made disasters, the news cycle almost seems to go up a gear rather than start to freewheel. And news stories in scholarly communications are no different. This summer saw a number of big stories, including one in Nature Index regarding alleged plagiarism and article publications in predatory journals by Russian university officials. Intrigued, I contacted the research group behind the investigation to learn more.

The group in question is the Commission for Counteracting the Falsification of Scientific Research at the Russian Academy of Sciences, and earlier this year it compiled what it claims is the first evidence of large-scale ‘translation plagiarism’ by Russian authors in English-language journals (“Predatory Journals at Scopus and WoS: Translation Plagiarism from Russian Sources” (2020), Commission for Counteracting the Falsification of Scientific Research, Russian Academy of Sciences, in collaboration with Anna A. Abalkina, Alexei S. Kassian and Larisa G. Melikhova). In total, the Commission said it had detected 94 predatory journals with 259 articles from Russian authors, many of which were plagiarised after being translated from Russian into English.

In addition, the study found that over 1,100 Russian authors had put their names to translated articles published in predatory journals. These included heads of departments at Russian universities and, in the case of three authors, over 100 publications each in predatory journals. The report (the original can be found here) was authored by some of the founders of Dissernet, a Russian research project that is developing a database of mainly Russian journals which publish plagiarised articles or violate other criteria of publication ethics. They are concerned that paper mills in Russia that spam authors and offer easy publication in journals are leading to wide-ranging breaches of publication ethics, supported by inflated metrics that appear to lend them some legitimacy. Cabells hopes to be able to support the work of Dissernet in highlighting the problem in Russia and internationally, so watch this space.

Know before you go

Earlier this week, the Guardian in the UK released its updated university rankings, just the latest of myriad national and international exercises in defining the “best” university. At a time when deciding to go to university is fraught with unknowns, Simon Linacre argues that critical thinking skills are more important than ever.


I’ll admit it: I love it when my own university tops any kind of ranking. The fact that I was there 25 years ago, and that the subjects and academics taught there are now unrecognisable, is of no consequence. That the department I graduated from is the BEST in the country is something I have to tell my wife and colleagues about, despite the arched eyebrows and disdain I can feel from several thousand miles away.

What does this mean? Well, aside from the fact that I must be a walking target for alumni fundraisers, it shows that my critical faculties, if not entirely absent, are certainly being overridden by shameless pride in something I have no right to be proud about. But like your favourite football team, you can’t help pulling for them through thick and thin – when they suck you say “we’re awful!”, and when they do well you say “we’re top!”.

Who’s “we”?

However, deciding which university to go to in a year or two is not the same as choosing a football team to support. You should use every critical faculty you have and hone it until it is razor-sharp before you even think of filling in a form or visiting a campus. And that means you should learn to read a university ranking like you would read a balance sheet before investing in a company, or reviewing a journal before submitting an article. I do not believe there is anything inherently wrong in any ranking as it can provide extremely useful data on which to base a decision. But you need to know where the data came from and how it relates to the investment you are making in your future life and career. This is why we always recommend users of Cabells’ Journalytics database use other relevant data points for their individual circumstances.

This week, the Guardian published its UK university rankings for 2021, with Oxford, St Andrews and Cambridge leading the way overall (full disclosure: I attended St Andrews). Each broad subject is then broken down into separate rankings with the promise that, “unlike other league tables, the Guardian rankings are designed with students in mind.” What, other university rankings do NOT have students in mind? Straight away, the amount of spin adopted here should tell you that (a) you should be careful of other hyperbole, and (b) you should look at other league tables to see why the Guardian would say this.

And there are plenty of tables to choose from – both nationally and internationally there are dozens of such rankings, all seeking to advise students on which university to choose. Why? Because, of the university guide’s 48 pages, six are adverts. Organisations publish rankings guides to sell advertising and cement their reputation as education experts, further enhancing their opportunities to sell education-related advertising in the future. Knowing why there is spin, and why these guides exist in the first place, should help students understand the information in front of them and ensure a better decision-making process.

But students should also dig deep into the data. In the “Business, management and marketing” subject ranking, readers are told that, “most universities will boast of having good links with business,” that “group work is a key part of many courses” and “there will also be a practical element to assessment.” But none of these points are addressed in the rankings, which include data on criteria such as course and teaching satisfaction, spend per student and “career after 15 months.” All of this information is relevant but only some has data to back it up.

Sitting at breakfast, my 12-year-old looked at the page on architecture (which he has wanted to study since the age of about seven) and decided he should go to Cambridge, UCL or Bath as the top three for that subject. None of those would be a bad choice, but neither would any of them be an informed one.