Cabells becomes a member of United Nations SDG Publishers Compact

Cabells is proud to announce its acceptance as a full member of the United Nations SDG Publishers Compact, becoming one of the first U.S. organizations and non-primary publishers globally to be awarded membership. Cabells joined the initiative as part of its ongoing commitment to support research and publications focused on sustainable solutions.

The SDG Publishers Compact was launched at the end of 2020, in collaboration with the International Publishers Association (IPA), as a way to stimulate action across the scholarly communications community and speed up progress towards the UN’s 17 Sustainable Development Goals (SDGs) by 2030.

As a signatory of the Publishers Compact, Cabells commits to developing sustainable practices and playing a key role in its networks and communities as a champion of the SDGs during what is becoming known as the ‘Decade of Action’ from 2020–2030. As such, Cabells is developing a number of solutions designed to help identify SDG-relevant journals and research for authors, librarians, funders, and other research-focused organizations.

Cabells’ Director of International Marketing & Development, Simon Linacre, said: “The UN SDGs have already done a remarkable job in directing funding and research to the most important questions facing our planet at this time. Becoming part of the UN SDG Publishers Compact will inspire Cabells into further playing our part in meeting these grand challenges.”

For more information, visit www.cabells.com or read the UN’s original press release.

Predatory journals vs. preprints: What’s the difference?

Working towards publication in a legitimate journal, however circuitous the route, is of course a much better path than publishing in an illegitimate one. Even so, Simon Linacre examines why the question posed in the title is worth considering.

A blog post this week in The Geyser pointed out the problems surrounding version control of the same article on multiple preprint servers and on the F1000 platform.

TL;DR? It isn’t pretty.

The article used as an example is unquestionably a legitimate study relating to the coronavirus pandemic, and as such is a small but important piece in the jigsaw being built around science’s pandemic response. The article has yet to be validated – and thereby confirmed as a piece that fits the COVID-19 jigsaw – something that will presumably happen once it is published in a recognized peer-reviewed journal.

However, this does raise the following rather thorny question: how is the article any better served fragmented on different preprint servers and publishing platforms than it would be having been published as a single entity in a predatory journal?

I am being facetious here – working towards a legitimate publication, however circuitous the route, is far better than publishing in an illegitimate journal. However, comparing the two options is not as strange as one might think, and perhaps offers some guidance for authors uncertain about where to publish their research in the first place.

Firstly, early career researchers (ECRs), while often offered very little direction when it comes to publication ethics and decision-making, are understandably worried about sharing their data and findings on preprint servers for fear of being ‘scooped’ by other researchers who copy their results and get published first. This is a legitimate fear, and is one explanation of why a researcher might submit their work to an unfamiliar journal offering a low fee and a quick turnaround.

Secondly, ECRs or more experienced researchers may be incentivised by their institutions to simply achieve a publication without any checks on the type of journal they publish in. As such, they need a journal to validate their publication – even if the journal itself has not been validated – which is something preprints or non-journal platforms are unable to provide.

Finally, while recent research has shown that just over half of articles published in predatory journals do not receive any citations, just under half did receive them, and authors may prefer a single accessible source for their research to multiple versions scattered across different preprint servers. This is not to say that preprints can’t receive citations – indeed, Google Scholar reveals 22 citations to the article above since its original posting on arXiv – but the perception may be that only journals can deliver citations, and they will therefore be the aim for some authors.

Of course, authors should know the very real difference between a predatory journal and a preprint, but the evidence of 14,000+ journals in Cabells’ Predatory Reports database and the millions of spam emails received daily from illegitimate journals points to at least some researchers falling for the same tricks and continuing to line the pockets of predatory publishers. While research publishing options remain as varied and complex as they are – and while higher education institutions and funders simply assume every researcher has an effective publishing strategy – as many will fall into the predatory trap as ever.

Book review – Gaming the Metrics: Misconduct and Manipulation in Academic Research

The issues of gaming metrics and predatory publishing undoubtedly go hand in hand, outputs from the same system that requires academic researchers the world over to sing for their supper in some form or other. However, the two practices are often treated separately, almost as if there were no link at all, so editors Biagioli and Lippman are to be congratulated for bringing them together under the same roof in their book Gaming the Metrics: Misconduct and Manipulation in Academic Research (MIT Press, 2020).

The book is a collection of chapters that cover the whole gamut of wrongheaded – or just plain wrong – publication decisions made by authors the world over on where to publish the fruits of their research. This ‘submission decision’ is unenviable, as it inevitably shapes academic careers to a greater or lesser degree. The main reason why authors make poor decisions is laid firmly at the door of a variety of ‘publish or perish’ systems which seek to quantify authors’ outputs with a view to… well, the reason why outputs are quantified is never really explained. However, the reason why such quantification should be a non-starter is well argued by Michael Power in Chapter 3, as well as by Barbara M. Kehm (Ch. 6) in terms of the ever-popular university rankings. Even peer review comes under attack from Paul Wouters (Ch. 4), but as with the other areas, any solutions are either absent or, in the case of Wouters, proffered with minimal detail or real-world context.

Once into the book, any author would quickly realize that their decision to publish is fraught with difficulty, with predatory publishers lurking on the internet to entice articles and APCs from them. As such, any would-be author would be well advised to heed the call ‘caveat scriptor’ and read this book before sending off their manuscript to any journal.

That said, there is also a case for advising ‘caveat lector’ before would-be authors read the book, as there are other areas where additional context would greatly help in addressing the problems of gaming metrics and academic misconduct. When it comes to predatory journals, there is a good deal of useful information in several of the later chapters, especially the case studies in Chapters 7 and 15, which detail a suspiciously prolific Czech author and a sting operation, respectively.

Indeed, these cases provide the kind of context whose absence is perhaps the single biggest failing of the book, which through its narrow academic lens doesn’t quite capture the wider picture of why gaming the metrics of the scholarly communications system is ethically wrong, both for those who perpetrate it and, arguably, for the architects of the system itself. As with many academic texts that seek to tackle societal problems, the unwillingness to get dirt under the fingernails in pursuit of understanding what’s really going on simply distances the reader from the problem at hand.

As a result, after reading Gaming the Metrics, one is likely to simply shrug one’s shoulders in apathy at the plight of authors and their institutions, whereas a great deal more impact might have been achieved if the approach had been less academic and included more case studies and insights into the negative consequences of predatory publishing practices. After all, the problem with gaming the system is that, for those who suffer, it is anything but a game.

Gaming the Metrics: Misconduct and Manipulation in Academic Research, edited by Mario Biagioli and Alexandra Lippman (published February 21, 2020, MIT Press), ISBN: 978-0262537933.

Cabells and scite partner to bring Smart Citations to Journalytics

Cabells, a provider of key intelligence on academic journals for research professionals, and scite, a platform for discovering and evaluating scientific articles, are excited to announce the addition of scite’s Smart Citations to Cabells Journalytics publication summaries.

Journalytics summary card with scite Smart Citations data

Journalytics is a curated database of over 11,000 verified academic journals spanning 18 disciplines, developed to help researchers and institutions optimize decision-making around the publication of research. Journalytics summaries provide publication and submission information and citation-backed data and analytics for comprehensive evaluations.

scite’s Smart Citations allow researchers to see how articles have been cited by providing the context of the citation and a classification describing whether it provides supporting or disputing evidence for the cited claim.

The inclusion of Smart Citations adds a layer of perspective to Journalytics metrics and gives users a deeper understanding of journal activity by transforming citations from a mere number into contextual data.

Lacey Earle, executive director of Cabells, says, “Cabells is thrilled to partner with scite in order to help researchers evaluate scientific articles through an innovative, comparative-based metric system that encourages rigorous and in-depth research.”

Josh Nicholson, co-founder and CEO of scite says of the partnership, “We’re excited to be working with Cabells to embed our Smart Citations into their Journalytics summaries. Smart Citations help you assess the quantity of citations a journal has received as well as the quality of these citations, with a focus on identifying supporting and disputing citations in the literature.”


about cabells

Cabells generates actionable intelligence on academic journals for research professionals.  On the Journalytics platform, an independent, curated database of more than 11,000 verified scholarly journals, researchers draw from the intersection of expertise, data, and analytics to make confident decisions to better administer research. In Predatory Reports, Cabells has undertaken the most comprehensive and detailed campaign against predatory journals, currently reporting on deceptive behaviors of over 14,000 publications. By combining its efforts with those of researchers, academic publishers, industry organizations, and other service providers, Cabells works to create a safe, transparent and equitable publishing ecosystem that can nurture generations of knowledge and innovation. For more information please visit Cabells or follow us on Twitter, LinkedIn and Facebook.

about scite

scite is a Brooklyn-based startup that helps researchers better discover and evaluate scientific articles through Smart Citations–citations that display the context of the citation and describe whether the article provides supporting or disputing evidence. scite is used by researchers from dozens of countries and is funded in part by the National Science Foundation and the National Institute of Drug Abuse of the National Institutes of Health. For more information, please visit scite, follow us on Twitter, LinkedIn, and Facebook, and download our Chrome or Firefox plugin. For careers, please see our jobs page.

The fake factor

On the day that the US says goodbye to its controversial President, we cannot bid farewell to one of his lasting achievements, which is to highlight issues of fake news and misinformation. Simon Linacre looks at how putting the issue in the spotlight could at least increase people’s awareness… and asks for readers’ help to do so.

Cabells completed around a dozen webinars with Indian universities towards the end of 2020 in order to share some of our knowledge of predatory publishing, and also to learn from librarians, faculty members and students what their experiences were. Studies have shown that India is home to both the highest number of predatory journals and the most authors publishing in them, as well as a government as committed as any to dealing with the problem, so any insight from the region is extremely valuable.

Q&A sessions following the webinars were especially rich, with a huge range of queries and concerns raised. One specific query raised a number of issues: how can researchers know if the index a journal says it is listed in is legitimate or not? As some people will be aware, one of the tricks of the trade for predatory publishers is to promote indices their journals are listed in, which can come in several types:

  • Pure lies: These are journals that say they have an ‘Impact Factor’, but are not listed by Clarivate Analytics in its Master Journal List of titles indexed in Web of Science (and therefore cannot have an Impact Factor unless only recently accepted)
  • Creative lies: These journals say they are listed by an index, which is true, but the index is little more than a list of journals which say they are listed by it, with the words ‘Impact Factor’ added to make it sound better (e.g. ‘Global Impact Factor’, ‘Scholarly Article Impact Factor’)
  • Nonsensical lies: These are links (or usually just images) to seemingly random words or universities that try to impart some semblance of recognition, but mean nothing. For example, it may be the name of a list, service or institution, but a quick search turns up nothing linking those names with the journal
  • White lies: One of the most common, many predatory journals say they are ‘listed’ or ‘indexed’ by Google Scholar. While it is true to say these journals can be discovered by Google Scholar, they are not listed or indexed for the simple reason that GS is not a list or an index

When Jeffrey Beall was active, he included a list of ‘Misleading Metrics’ on his blog that highlighted some of these issues. A version or versions of this can still be found today, but are not linked to here because (a) they are out of date by at least four years, and (b) the term ‘misleading’ is, well, misleading, as few of the indexes include metrics in the first place, and the metrics may not be the major problem with the index. However, this information is very valuable, and as such Cabells has begun its own research program to create an objective, independently verifiable and freely available list of fake indexes in 2021. And, what’s more, we need your help – if anyone would like to suggest a suspicious-looking journal index for us to look into, please write to me at simon.linacre@cabells.com and we will review the site for inclusion.

Back to basics

As we enter what is an uncertain 2021 for many both personally and professionally, it is worth perhaps taking the opportunity to reset and refocus on what matters most to us. In his latest blog post, Simon Linacre reflects on Cabells’ new video and how it endeavors to show what makes us tick.

It is one of the ironies of modern life that we seem to take comfort in ‘doomscrolling’, that addictive pastime of flicking through Twitter or other social media on the hunt for the next scandal to inflame our ire. Whether it is Brexit, the coronavirus epidemic or alleged election shenanigans, we can’t seem to get enough of the tolls of doom ringing out in our collective echo chambers. As the New Year dawns with little good news to cheer us, we may as well go all in as the world goes to hell in a handcart.

Of course, we also like the lighter moments that social media provide, such as cat videos and epic fails. And it is comforting to hear some stories that renew our faith in humanity. One parent on Twitter remarked this week, as the UK’s schools closed and reverted to online learning, that she was so proud of her child who, on hearing the news, immediately started blowing up an exercise ball, resolved not to waste the opportunity lockdown provided to get fit.

Reminding ourselves that the glass can be at least half full even if it looks completely empty is definitely a worthwhile exercise, even if it feels like the effort of constantly refilling it is totally overwhelming. At Cabells, our source of optimism has recently come from the launch of our new video. The aim of the video is to go back to basics and explain what Cabells does, why it does it, and how it does it through its two main products – Journalytics and Predatory Reports.

Making the video was a lot of fun, on what was a beautiful sunny Spring day in Edinburgh with one of my US colleagues at an academic conference (remember them?). While nerve-shredding and embarrassing, it was also good to go back to basics and underline why Cabells exists and what we hope to achieve through all the work we do auditing thousands of journals every year.

It also acted as a reminder that there is much to look forward to in 2021 that will keep our glasses at least half full most of the time. Cabells will launch its new Medical journal database early this year, which will see over 5,000 Medical journals indexed alongside the 11,000 journals already in Journalytics. And we also have major upgrades and enhancements planned for both the Journalytics and Predatory Reports databases that will help researchers, librarians and funders better analyse journal publishing activities. So, let’s raise a (half full) glass to the New Year, and focus on the light at the end of the tunnel and not the darkness that seems to surround us in early January.

Cabells and AMBA launch list of most impactful Chinese language management journals

In his last blog post in what has been a tumultuous year, Simon Linacre looks forward to a more enlightened 2021 and a new era of open collaboration and information sharing in scholarly communications and higher education.

In a year with so many monumental events, it is perhaps pointless to try and review what has happened. Everyone has lived every moment with such intensity – whether it be through 24-hour news coverage, non-stop social media or simply living life under lockdown – that it seems simply too exhausting to live through it all again. So, let’s fast forward to 2021 instead.

While some of the major concerns from 2020 will no doubt remain well into the New Year, they will also fade away gradually and be replaced by new things that will demand our attention. Difficult as it may seem now, neither Trump, Brexit (for the Brits) nor COVID will have quite the hold on the news agenda as they did, and that means there is an opportunity at least for some more positive news to start to dominate the headlines.

One activity that may succeed in this respect is the open science agenda. With a new budget agreed upon by the European Research Council and a new administration in Washington DC, together with an increasing focus more generally on open science and collaboration, it is to be hoped that there will be enough funding in place to support it. If the recent successes behind the COVID-19 vaccines show anything it is surely that focused, fast, mission-driven research can produce life-changing impacts for a huge number of people. As others have queried, what might happen if the same approach was adopted and supported for tackling climate change?

In the same vein, information sharing and data analysis should also come further to the fore in 2021. While in some quarters, consolidation and strategic partnerships will bring organisations together, in others the importance of data analysis will only become more essential in enabling evidence-based decision-making and creating competitive advantages.

In this way, the announcement made today by Cabells and the Association of MBAs and Business Graduates Association (AMBA & BGA) brings both these themes together in the shape of a new list of quality Chinese-language journals in business and management. The AMBA-Cabells Journal Report (ACJR) has been curated jointly by the two organisations, combining Cabells’ indexing expertise with AMBA & BGA’s knowledge of Chinese journals. Both organisations have been all too aware of the Western-centric focus of many indices and journal lists, and believe this is a positive first step towards broadening knowledge and understanding of Chinese-language journals, and of non-English journals more broadly.

There have also been policy changes in China during 2020 which have meant less reliance on journals with Impact Factors, and more of a push to incentivise publication in high-quality local journals. As such, the ACJR should provide a valuable guide for business school authors in China to some of the top journals available to them. The journals themselves were first identified using a number of established Chinese sources, as well as input from esteemed scholars and deans of top business schools. Recommended journals were then checked using Google Scholar to ensure they had published consistently over the last five years and attracted high levels of citations.

The new list is very much intended to be an introduction to Chinese-language journals in business and management, and we would very much welcome feedback on it so we can develop it further for a second iteration in 2021.

For more information on ACJR, visit https://www.associationofmbas.com/ and https://www.cabells.com/ 

What to know about ISSNs

There are many ways to skin a cat, and many ways to infer a journal could be predatory. In his latest blog post, Simon Linacre looks at the role the International Standard Serial Number, or ISSN, can play in the production of predatory journals. 

For many reasons, the year 2020 will be remembered for the sheer volume of numbers that have invaded our consciousness. Some of these are big numbers – 80 million votes for Joe Biden, four million cases of COVID in the US in 2020 – and some are small, such as the number of countries (1) leaving the EU at the end of the year. Wherever we look, we see numbers of varying degrees of import at seemingly every turn.

While numbers have previously been regarded as gospel, however, data has joined news and UFO sightings (seemingly one of the few phenomena NOT to increase in 2020) as something to be suspicious of, or liable to be faked in some way. And one piece of data trusted by many authors in determining the validity or otherwise of a journal is the International Standard Serial Number, or ISSN.

An ISSN can be obtained relatively easily via either a national or international office as long as a journal can be identified as an existing publication. As the ISSN’s own website states, an ISSN is “a digital code without any intrinsic meaning” and does not include any information about the contents of that publication. Perhaps most importantly, an ISSN “does not guarantee the quality or the validity of the contents”. This perhaps goes some way to explain why predatory journals can often include an ISSN on their websites. Indeed, more than 40% of the journals included in Cabells’ Predatory Reports database include an ISSN in their journal information.

But sometimes predatory publishers can’t obtain an ISSN – or at least can’t be bothered to – and will fake the ISSN code. Of the 6,000 or so journals with an ISSN in Predatory Reports, 288 or nearly 5% have a fake ISSN, and this is included as one of the database’s behavioural indicators to help identify predatory activity. It is instructive to look at these fake ISSNs to see the lengths predatory publishers will go to in order to achieve some semblance of credibility in their site presence.

For some journals, it is obvious that the ISSN is fake because it looks wrong. In the example above for the Journal of Advanced Statistics and Probability, the familiar format of two groups of four digits separated by a hyphen is missing, replaced by nine digits and a forward slash, which is incorrect.

For other journals, such as the Global Journal of Nuclear Medicine and Biology below, the format is correct, but a search using the ISSN portal brings up no results, so the ISSN code is simply made up.
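For readers who want a quick first-pass check of their own, the short Python sketch below is one way to verify the standard NNNN-NNNC shape and the ISO 3297 check digit (where the final character may be an ‘X’ standing for ten). It is an illustrative example rather than a Cabells tool, and the function and variable names are my own; a correctly formed ISSN can of course still be fake or hijacked, so a search of the ISSN portal remains the definitive test.

```python
import re

# NNNN-NNNC: four digits, a hyphen, three digits, then a digit or X (the check character)
ISSN_PATTERN = re.compile(r"^\d{4}-\d{3}[\dXx]$")

def issn_check_digit(first_seven: str) -> str:
    """Compute the ISO 3297 check digit for the first seven digits of an ISSN."""
    total = sum(int(d) * w for d, w in zip(first_seven, range(8, 1, -1)))
    remainder = total % 11
    if remainder == 0:
        return "0"
    value = 11 - remainder
    return "X" if value == 10 else str(value)

def looks_like_valid_issn(issn: str) -> bool:
    """Return True if the string has the right shape and a correct check digit.

    This only tests the format; it cannot tell whether the ISSN is actually
    registered, or whether it belongs to the journal claiming it.
    """
    if not ISSN_PATTERN.match(issn):
        return False
    digits = issn.replace("-", "")
    return issn_check_digit(digits[:7]) == digits[7].upper()

print(looks_like_valid_issn("0317-8471"))   # True: well formed, check digit matches
print(looks_like_valid_issn("0317-8472"))   # False: wrong check digit
print(looks_like_valid_issn("123456789/"))  # False: wrong shape entirely
```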

More worrying are the few publications that have hijacked existing, legitimate journals and appropriated their identity, including the ISSN. In the example below, the Wulfenia Journal has had its identity hijacked; the fake journal’s website is pictured.

If you compare it to the genuine journal shown below (the German homepage can be found here), you can see they list the same ISSN.

One can only imagine the chaos caused for a legitimate journal when its identity is hijacked, and this is just part of wider concerns about the effects that sharing fake information has on society. As always, arming yourself with the right information – and taking a critical approach to any information directed your way – will help see you through the morass of misinformation we seem to be bombarded with in the online world.

Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination'” now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

The A-Z’s of predatory publishing

Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one tweet a week working through the entire alphabet. In this week’s blog, Simon Linacre republishes all 26 tweets in one place as a primer on how to deal with the phenomenon successfully.

A = American. US probable source of #PredatoryJournals activity as it lends credence to claims of legitimacy a journal may adopt to hoodwink authors into submitting articles #PredatoryAlphabet #PredatoryJournal #PublicationEthics

B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals

C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.

D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal

E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature

F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior

G = Germany, which takes #PredatoryJournals seriously through university-level checks, highlighting the issue and exposing problems in a 2018 investigation

H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity

I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)

J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivate Master Journal List

K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be important as subject knowledge for publication. ‘Research your research’ by using tools such as @CabellsPublish databases

L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database

M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases

N = Nature. Using a trusted scholarly brand such as @nature can help identify, understand and define #PredatoryJournals, with dozens of articles on the subject via @NatureNews

O = OMICS. In 2019, the FTC fined publisher OMICS over $50m for deceptive publishing practices

P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals

Q = Quick publication. Peer review is a long process typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks

R = Research. Academic #Research should also include research on #Journals to enable #PublicationEthics and #ResearchIntegrity are followed. Use @CabellsPublish Predatory Reports to support your research

S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database

T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices

U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

V = Vocabulary. Look out for ‘unusual’ vocabulary in predatory journal solicitation emails (e.g., “…your precious paper…”) #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)

X = There are no current ISSNs with more than one ‘X’ at the end, which serves as a check digit. Be aware of fake ISSNs with multiple X’s

Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article

Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article