Book review – Gaming the Metrics: Misconduct and Manipulation in Academic Research

The issues of gaming metrics and predatory publishing undoubtedly go hand-in-hand, outputs from the same system that requires academic researchers the world over to sing for their supper in some form or other. However, the two practices are often treated separately, almost as if there were no link at all, so editors Biagioli and Lippman are to be congratulated on bringing them together under the same roof in the shape of their book Gaming the Metrics: Misconduct and Manipulation in Academic Research (MIT Press, 2020).

The book is a collection of chapters that cover the whole gamut of wrongheaded – or just plain wrong – publication decisions made by authors the world over on where to publish the fruits of their research. This ‘submission decision’ is unenviable, as it inevitably shapes academic careers to a greater or lesser degree. The main reason why authors make poor decisions is laid firmly at the door of a variety of ‘publish or perish’ systems which seek to quantify the outputs from authors with a view to… well, the reason why outputs are quantified is never really explained. However, the reason why such quantification should be a non-starter is well argued by Michael Power in Chapter 3, as well as by Barbara M. Kehm (Ch. 6) in terms of the ever-popular university rankings. Even peer review comes under attack from Paul Wouters (Ch. 4), but as with the other areas any solutions are either absent or, in the case of Wouters, proffered with minimal detail or real-world context.

Once into the book, any author would quickly realize that their decision to publish is fraught with difficulty, with predatory publishers lurking on the internet to entice articles and APCs from them. As such, any would-be author would be well advised to heed the call ‘Caveat scriptor’ and read this book before sending off their manuscript to any journal.

That said, there is also a case for advising ‘caveat lector’ before would-be authors read the book, as there are other areas where additional context would greatly help in addressing the problems of gaming metrics and academic misconduct. When it comes to predatory journals, there is a good deal of useful information included in several of the later chapters, especially the case studies in Chapters 7 and 15, which detail a suspiciously prolific Czech author and a sting operation, respectively.

Indeed, these cases provide the kind of context whose absence is perhaps the single biggest failing of the book: its narrow academic lens doesn’t quite capture the wider picture of why gaming metrics is ethically wrong for the scholarly communications system as a whole, both for those who perpetrate it and, arguably, the architects of the systems. As with many academic texts that seek to tackle societal problems, the unwillingness to get dirt under the fingernails in the pursuit of understanding what’s really going on simply distances the reader from the problem at hand.

As a result, after reading Gaming the Metrics, one is likely to simply shrug one’s shoulders in apathy at the plight of authors and their institutions, whereas a great deal more impact might have been achieved if the approach had been less academic and included more case studies and insights into the negative impact of predatory publishing practices. After all, the problem with gaming the system is that, for those who suffer, it is anything but a game.

Gaming the Metrics: Misconduct and Manipulation in Academic Research, edited by Mario Biagioli and Alexandra Lippman (MIT Press, February 21, 2020). ISBN: 978-0262537933.

Cabells and scite partner to bring Smart Citations to Journalytics

Cabells, a provider of key intelligence on academic journals for research professionals, and scite, a platform for discovering and evaluating scientific articles, are excited to announce the addition of scite’s Smart Citations to Cabells Journalytics publication summaries.

Journalytics summary card with scite Smart Citations data

Journalytics is a curated database of over 11,000 verified academic journals spanning 18 disciplines, developed to help researchers and institutions optimize decision-making around the publication of research. Journalytics summaries provide publication and submission information and citation-backed data and analytics for comprehensive evaluations.

scite’s Smart Citations allow researchers to see how articles have been cited by providing the context of the citation and a classification describing whether it provides supporting or disputing evidence for the cited claim.

The inclusion of Smart Citations adds a layer of perspective to Journalytics metrics and gives users a deeper understanding of journal activity by transforming citations from a mere number into contextual data.

Lacey Earle, executive director of Cabells, says, “Cabells is thrilled to partner with scite in order to help researchers evaluate scientific articles through an innovative, comparative-based metric system that encourages rigorous and in-depth research.”

Josh Nicholson, co-founder and CEO of scite says of the partnership, “We’re excited to be working with Cabells to embed our Smart Citations into their Journalytics summaries. Smart Citations help you assess the quantity of citations a journal has received as well as the quality of these citations, with a focus on identifying supporting and disputing citations in the literature.”


about cabells

Cabells generates actionable intelligence on academic journals for research professionals. On the Journalytics platform, an independent, curated database of more than 11,000 verified scholarly journals, researchers draw from the intersection of expertise, data, and analytics to make confident decisions to better administer research. In Predatory Reports, Cabells has undertaken the most comprehensive and detailed campaign against predatory journals, currently reporting on deceptive behaviors of over 14,000 publications. By combining its efforts with those of researchers, academic publishers, industry organizations, and other service providers, Cabells works to create a safe, transparent and equitable publishing ecosystem that can nurture generations of knowledge and innovation. For more information please visit Cabells or follow us on Twitter, LinkedIn and Facebook.

about scite

scite is a Brooklyn-based startup that helps researchers better discover and evaluate scientific articles through Smart Citations–citations that display the context of the citation and describe whether the article provides supporting or disputing evidence. scite is used by researchers from dozens of countries and is funded in part by the National Science Foundation and the National Institute of Drug Abuse of the National Institutes of Health. For more information, please visit scite, follow us on Twitter, LinkedIn, and Facebook, and download our Chrome or Firefox plugin. For careers, please see our jobs page.

The impact of blockchain tech on academic publishing

As blockchain technology continues to branch out well beyond the cryptocurrency world that initially brought it into being, it’s becoming clear it has many potential applications in education. In all likelihood we’re still in the early stages of the technology’s emergence in this field, and the applications will only continue to expand. Even now though, there are some interesting examples.

Perhaps most notable is that student records are becoming easier to keep track of and maintain securely because of blockchain technology. From basic information, to academic transcripts, to notes on course and extracurricular activity participation, there is a lot of information that educational institutions need to maintain and monitor. This can all be made a great deal easier if the information is entered into an incorruptible digital ledger — particularly when the time comes that the information needs to be transferred from one administrator or school to another.

Another important use is in the distribution of degrees. This process has been pioneered via digital diplomas from multiple universities, and has clear benefits for graduates, recruiters, and prospective employers alike. Turning a degree into a digital document makes it a component of what we might almost call an electronic résumé, easier for recruiters and employers to access and verify. The same practice may also prove particularly useful for online degrees, where there is no in-person component to education. Business education in particular has developed very quickly online, with both online bachelor’s degrees in business administration and MBAs leading candidates into fields growing far more quickly than other job markets.

These examples — keeping student records and turning degrees into digital documents — cover much of blockchain’s expansion into the realm of academia. Our primary focus here, however, is on another application that tends not to generate as much attention – or at least hasn’t yet. It has become clear that there are also various ways in which blockchain tech can significantly impact the world of academic publishing:

Mitigating Market Issues

While most people probably see academic publishing as a straightforward business or a by-the-book process, certain issues and inefficiencies can come into play, including plagiarized materials, predatory journals, and any number of other problems. But in a world in which academic publishing occurs via the blockchain, it could become easier for the agencies involved to ensure document integrity and spot these kinds of problems.

Storing Research Data

There’s a great deal of talk in general about data storage on the blockchain. To sum it up, the idea is essentially that blockchain solutions may rapidly supplant both in-house data storage and cloud storage options. Blockchain storage can make data harder to hack yet faster to access, and in theory it can provide virtually limitless capacity. This is typically discussed with regard to healthcare and larger industries, but it could affect academic publishing as well.

Enhancing Effectiveness & Quality

The perks just described ultimately amount to a more accountable and higher-quality academic publishing environment. By extension, it could well be that, in time, academic journals and other resources not published within a blockchain environment come to be seen as lower quality or less official. This may not happen in the short term; however, a degree of exclusivity based on practices that could gradually become an industry standard can be a positive step. The blockchain would begin to serve almost as a filter for quality academic practices and publications.

Peer Review Application

Some academic publishers are already experimenting with the idea of using blockchain technology to support peer review processes. Two of the problems of peer review are sharing multiple versions of documents with different people and the security required for double-blind review. Blockchain systems could enable secure sharing, with the added benefit of certifying the results of peer review for all those involved.

Leveraging Blockchain for Distribution

Finally, academic journal authors may also find the blockchain useful as a means of controlling distribution. Particularly in a world where people find so many ways of bypassing paywalls and downloading material freely, it is easy for valuable research and published material to essentially lose its value. Blockchain distribution has the potential to address this problem by ensuring material can only be obtained on the terms the author and/or publisher set.

In all of these ways and more, blockchain technology is poised to be every bit as important in academic publishing as in other aspects of academia. And it’s likely that the full range of benefits still has yet to be determined.

The fake factor

On the day that the US says goodbye to its controversial President, we cannot bid farewell to one of his lasting achievements, which is to highlight issues of fake news and misinformation. Simon Linacre looks at how putting the issue in the spotlight could at least increase people’s awareness… and asks for readers’ help to do so.

Cabells completed around a dozen webinars with Indian universities towards the end of 2020 in order to share some of our knowledge of predatory publishing, and also to learn from librarians, faculty members and students about their experiences. Studies have shown that India has both the highest number of predatory journals and the most authors publishing in them, as well as a government as committed as any to dealing with the problem, so any insight from the region is extremely valuable.

Q&A sessions following the webinars were especially rich, with a huge range of queries and concerns raised. One specific query raised a number of issues: how can researchers know if the index a journal says it is listed in is legitimate or not? As some people will be aware, one of the tricks of the trade for predatory publishers is to promote indices their journals are listed in, which can come in several types:

  • Pure lies: These are journals that claim to have an ‘Impact Factor’ but are not listed by Clarivate Analytics in its Master Journal List of titles indexed on Web of Science (and therefore do not have an Impact Factor, unless only recently accepted)
  • Creative lies: These journals say they are listed by an index, which is true, but the index is little more than a list of journals which say they are listed by the index, with the words ‘Impact Factor’ added to make it sound better (e.g. ‘Global Impact Factor’, ‘Scholarly Article Impact Factor’)
  • Nonsensical lies: These are links (or, usually, just images) to seemingly random words or universities that try to convey some semblance of recognition but mean nothing. For example, it may be the name of a list, service or institution, but a quick search elicits nothing linking those names with the journal
  • White lies: One of the most common, many predatory journals say they are ‘listed’ or ‘indexed’ by Google Scholar. While it is true to say these journals can be discovered by Google Scholar, they are not listed or indexed for the simple reason that GS is not a list or an index
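The four patterns above amount to simple membership checks against trusted sources. As a rough sketch of such screening (the set contents, function name, and verdict strings here are all invented for illustration, not real Cabells tooling):

```python
# Hypothetical screening of a journal's claimed indexing.
# All data below is illustrative, not drawn from any real database.

VERIFIED_INDEXES = {"Web of Science", "Scopus", "DOAJ"}

# Phrases that mimic the real (Clarivate) Impact Factor brand.
LOOKALIKE_METRICS = {"Global Impact Factor", "Scholarly Article Impact Factor"}


def screen_claims(claimed_indexes):
    """Return (claim, verdict) pairs for a journal's indexing claims."""
    verdicts = []
    for claim in claimed_indexes:
        if claim in VERIFIED_INDEXES:
            # Still needs confirming against the index's own master list.
            verdicts.append((claim, "verifiable - confirm on the index's master list"))
        elif claim in LOOKALIKE_METRICS:
            verdicts.append((claim, "lookalike metric - treat as a red flag"))
        elif claim == "Google Scholar":
            # GS crawls the open web; it is not a curated list or index.
            verdicts.append((claim, "not an index - discoverability only"))
        else:
            verdicts.append((claim, "unrecognized - search for independent evidence"))
    return verdicts
```

For example, `screen_claims(["Google Scholar", "Global Impact Factor"])` would flag the first claim as mere discoverability and the second as a lookalike metric.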

When Jeffrey Beall was active, he included a list of ‘Misleading Metrics’ on his blog that highlighted some of these issues. A version or versions of this can still be found today, but are not linked to here because (a) they are at least four years out of date, and (b) the term ‘misleading’ is, well, misleading, as few of the indexes include metrics in the first place, and the metrics may not be the major problem with the index. However, this information is very valuable, and as such Cabells has begun its own research program to create an objective, independently verifiable and freely available list of fake indexes in 2021. And, what’s more, we need your help – if you would like to suggest a suspicious-looking journal index for us to investigate, please write to me at simon.linacre@cabells.com and we will review the site for inclusion.

Back to basics

As we enter what is an uncertain 2021 for many both personally and professionally, it is worth perhaps taking the opportunity to reset and refocus on what matters most to us. In his latest blog post, Simon Linacre reflects on Cabells’ new video and how it endeavors to show what makes us tick.

It is one of the ironies of modern life that we seem to take comfort in ‘doomscrolling’, that addictive pastime of flicking through Twitter or other social media on the hunt for the next scandal to inflame our ire. Whether it is Brexit, the coronavirus epidemic or alleged election shenanigans, we can’t seem to get enough of the tolls of doom ringing out in our collective echo chambers. As the New Year dawns with little good news to cheer us, we may as well go all in as the world goes to hell in a handcart.

Of course, we also like the lighter moments that social media provide, such as cat videos and epic fails. And it is comforting to hear stories that renew our faith in humanity. One parent on Twitter remarked this week, as the UK’s schools closed and reverted to online learning, that she was so proud of her child who, on hearing the news, immediately started blowing up an exercise ball, resolving not to waste the opportunity lockdown provided of getting fit.

Reminding ourselves that the glass can be at least half full even if it looks completely empty is definitely a worthwhile exercise, even if it feels like the effort of constantly refilling it is totally overwhelming. At Cabells, our source of optimism has recently come from the launch of our new video. The aim of the video is to go back to basics and explain what Cabells does, why it does it, and how it does it through its two main products – Journalytics and Predatory Reports.

Making the video was a lot of fun, on what was a beautiful sunny Spring day in Edinburgh with one of my US colleagues at an academic conference (remember them?). While nerve-shredding and embarrassing, it was also good to go back to basics and underline why Cabells exists and what we hope to achieve through all the work we do auditing thousands of journals every year.

It also acted as a reminder that there is much to look forward to in 2021 that will keep our glasses at least half full for most of the time. Cabells will launch its new Medical journal database early this year, which will see over 5,000 Medical journals indexed alongside the 11,000 journals indexed in Journalytics. And we also have major upgrades and enhancements planned for both Journalytics and Predatory Reports databases that will help researchers, librarians and funders better analyse journal publishing activities. So, let’s raise a (half full) glass to the New Year, and focus on the light at the end of the tunnel and not the darkness that seems to surround us in early January.

Cabells and AMBA launch list of most impactful Chinese language management journals

In his last blog post in what has been a tumultuous year, Simon Linacre looks forward to a more enlightened 2021 and a new era of open collaboration and information sharing in scholarly communications and higher education.

In a year with so many monumental events, it is perhaps pointless to try and review what has happened. Everyone has lived every moment with such intensity – whether it be through 24-hour news coverage, non-stop social media or simply living life under lockdown – that it seems simply too exhausting to live through it all again. So, let’s fast forward to 2021 instead.

While some of the major concerns from 2020 will no doubt remain well into the New Year, they will also fade away gradually and be replaced by new things that will demand our attention. Difficult as it may seem now, neither Trump, Brexit (for the Brits) nor COVID will have quite the hold on the news agenda as they did, and that means there is an opportunity at least for some more positive news to start to dominate the headlines.

One activity that may succeed in this respect is the open science agenda. With a new budget agreed upon by the European Research Council and a new administration in Washington DC, together with an increasing focus more generally on open science and collaboration, it is to be hoped that there will be enough funding in place to support it. If the recent successes of the COVID-19 vaccines show anything, it is surely that focused, fast, mission-driven research can produce life-changing impacts for a huge number of people. As others have asked, what might happen if the same approach were adopted and supported for tackling climate change?

In the same vein, information sharing and data analysis should also come further to the fore in 2021. While in some quarters, consolidation and strategic partnerships will bring organisations together, in others the importance of data analysis will only become more essential in enabling evidence-based decision-making and creating competitive advantages.

In this way, the announcement made today by Cabells and the Association of MBAs and Business Graduates Association (AMBA & BGA) brings both these themes together in the shape of a new list of quality Chinese-language journals in business and management. The AMBA-Cabells Journal Report (ACJR) has been curated jointly by the two organisations, using the indexing expertise of Cabells and AMBA & BGA’s knowledge of Chinese journals. Both organisations have been all too aware of the Western-centric focus of many indices and journal lists, and believe this is a positive first step towards broadening knowledge and understanding of Chinese-language journals, and non-English journals more generally.

There have also been policy changes in China during 2020 which have meant less reliance on journals with Impact Factors, and more of a push to incentivise publication in high-quality local journals. As such, the ACJR should provide a valuable guide for business school authors in China to some of the top journals available to them. The journals themselves were first identified using a number of established Chinese sources, as well as input from esteemed scholars and deans of top business schools. Recommended journals were then checked using Google Scholar to ensure they had published consistently over the last five years and attracted high levels of citations.

The new list is very much intended to be an introduction to Chinese-language journals in business and management, and we would very much welcome input from people on the list so we can develop it further for a second iteration in 2021.

For more information on ACJR, visit https://www.associationofmbas.com/ and https://www.cabells.com/ 

Guest Post: A look at citation activity of predatory marketing journals

This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.

Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit and receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal to which to submit a manuscript is thus a crucial decision. Furthermore, the overabundance of academic marketing journals – and the increasing “publish or perish” pressure – makes this decision even more difficult.

The “market” of marketing journals is extremely broad, with Cabells’ Journalytics indexing 965 publication venues that are associated with “marketing” in their aims and scope. While monitoring the market of marketing journals over the last ten years, I have noticed that a new type of journal has tapped into it: open access (OA) journals.

The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a “new” marketing journal published by “IBIMA”. Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing’s website. The thing that I remember most from that visit is that IBIMA’s website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating the case of that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.

In undertaking this research, I discovered terms such as “predatory publishers”, “Beall’s List”, and “Think, Check, Submit” for the first time. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).

Beall’s list was no longer available (shut down in January 2017), and I had no access to Cabells’ Predatory Reports. Freely available lists were either outdated or too specialized (mainly in science, technology, and medicine) to be useful. So, I searched for journals with titles identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and visited each of their websites to collect information about both the publisher and the journal: whether the journal is OA or not, its article processing charges, whether it had an editor-in-chief, the names of its review board members and their affiliations (if any), the journal’s ISSNs, etc. I even emailed an eminent marketing scholar whose name I was stunned to see included in the editorial board of a suspicious journal.

With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).

Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was wrong or right in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three. These three lists, however, provided no reasons for the inclusion of a particular publisher or journal.

To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.
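The cross-checking procedure described above is, at heart, a set-membership routine: unanimous presence on independent blocklists, combined with absence from curated allowlists such as the DOAJ, ABDC, and AJG lists, strengthens the predatory classification. A minimal sketch (the function, journal names, and list contents are invented stand-ins, not the actual lists):

```python
# Cross-check a publisher/journal against independent blocklists
# (stand-ins for Dolos, Kscien, Stop Predatory Journals) and
# allowlists (stand-ins for DOAJ, ABDC, AJG). Data is illustrative.

def classify(journal, publisher, blocklists, allowlists):
    """Classify a journal from independent list memberships."""
    flagged = sum(publisher in bl for bl in blocklists)
    endorsed = any(journal in al for al in allowlists)
    if flagged == len(blocklists) and not endorsed:
        return "likely predatory"      # unanimous blocklist hit, no endorsement
    if endorsed and flagged == 0:
        return "likely legitimate"
    return "inconclusive - investigate further"


blocklists = [{"PubCo A"}, {"PubCo A"}, {"PubCo A"}]  # three blocklist stand-ins
allowlists = [set(), set(), set()]                    # three allowlist stand-ins
print(classify("Journal I", "PubCo A", blocklists, allowlists))
# -> likely predatory
```

No single list is authoritative, which is why the study triangulated across all six sources before drawing conclusions.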

To be brief, that year of endeavor resulted in a paper I submitted to the prestigious academic journal Scientometrics, published by Springer Nature; the paper was accepted and published online in late October 2020. In it, I reported the findings of a study that examined the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an “empty journal”; that is, one with no archives). The results indicated that these journals received quite a few citations, with a median of 490 and one journal receiving 6,296 citations (see also the Case Study below).

I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory. 

A few months earlier, having no access to Cabells’ databases, I read each of the posts on their blog trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed (or under review for inclusion) in Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (about 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.

Case Study

The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.

Dr. Moussa rightly suspected that the British Journal of Marketing Studies looked suspicious due to its familiar-sounding title. This is a well-used strategy by predatory publishers to deceive authors who do not make themselves familiar with the original journal. In this case, the British Journal of Marketing Studies sounds similar to a number of potential journals in this subject discipline.

As Dr. Moussa also points out, a questionable journal’s website will often fail to stand up to a critical eye. For example, the picture below shows the “offices” of BJMS – a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal’s website also contains a number of other tells that, while not individually proof of predatory practices, certainly provide indicators: prominent phone numbers, reference to an ‘Impact Factor’ (not from Clarivate), fake claims of indexation in databases (e.g. DOAJ), no editor contact details, and/or a fake editor identity.

What is really interesting about Dr. Moussa’s piece is his investigation of citation activity. We can see from the data below for ‘Journal I’ (the British Journal of Marketing Studies) that both the total citations and the most citations received by a single article are significant, and represent what is known as ‘citation leakage’, where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proofing, their content is unreliable and skews citation data for reputable research and journals.

  • Predatory journal: Journal I (BJMS)
  • Total number of citations received: 1,331
  • Number of citations received by the most cited article: 99
  • The most cited article was published in: 2014
  • Number of citations received from SSCI-indexed journals: 3
  • Number of citations received from FT50 listed journals: 0
Predatory Reports entry for BJMS
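A couple of lines of arithmetic put the figures above in proportion: only a tiny share of BJMS citations comes from SSCI-indexed journals, while a single article accounts for a sizeable share of the total.

```python
# Citation figures for 'Journal I' (BJMS) as reported above.
total_citations = 1331
most_cited_article = 99
ssci_citations = 3
ft50_citations = 0

# Shares of the journal's citation count.
ssci_share = ssci_citations / total_citations
top_article_share = most_cited_article / total_citations

print(f"SSCI share: {ssci_share:.2%}")                # 0.23%
print(f"Top-article share: {top_article_share:.2%}")  # 7.44%
```

In other words, fewer than 1 in 400 of the journal’s citations comes from an SSCI-indexed source, and none at all from FT50 journals.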

It is a familiar refrain from The Source, but it bears repeating – as an author you should do due diligence on where you publish your work and ‘research your research’. Using your skills as a researcher for publication and not just what you want to publish will save a huge amount of pain in the future, both for avoiding the bad journals and choosing the good ones.

Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination'” now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

How do you know you can trust a journal?

As many readers know, this week is Peer Review Week, the annual opportunity for those involved in scholarly communication and research to celebrate and learn about all aspects of peer review. As part of this conversation, Simon Linacre reflects on this year’s theme of ‘Trust in Peer Review’ in terms of the important role of peer review in the validation of scholarship, and the dangers of predatory behaviour in its absence.


I was asked to deliver a webinar recently to a community of scholars in Eastern Europe and, as always with webinars, I was very worried about the Q&A section at the end. When you deliver a talk in person, you can tell by looking at the crowd what is likely to happen at the end of the presentation and can prepare yourself. A quiet group of people means you may have to ask yourself some pretty tough questions, as no one will put their hand up at the end to ask you anything; a rowdy crowd is likely to throw anything and everything at you. With a webinar, there are no cues, and as such, it can be particularly nerve-shredding.

With the webinar in question, I waited a while for a question and was starting to prepare my quiet crowd response, when a single question popped up in the chat box:

How do you know you can trust a journal?

As with all the best questions, this floored me for a while. How do you know? The usual things flashed across my mind: reputation, whether it has published known scholars in its field, whether it is indexed by Cabells or other databases, and so on. But suddenly the word trust felt a lot more personal than a simple tick-box exercise to confirm a journal’s standing. Such checks may confirm a journal is trustworthy, but is that the same as the feeling an individual has when they really trust something or someone?

The issue of trust is often the unsaid part of the global debates currently raging, whether over responses to the coronavirus epidemic, climate change or democracy. Politicians, as always, want the people to trust them, but increasingly their actions seem to be making that trust harder and harder to give. As I write, the UK has just put its two top scientists in front of the cameras to give a grave warning about COVID-19 and a second wave of cases. The fact that no senior politician joined them was highly symbolic.

It is against this background that Trust in Peer Review is an appropriate theme for Peer Review Week (full disclosure: I have recently joined one of the PRW committees to support the initiative). There is a huge groundswell of support from publishers, editors and academics for both the effectiveness of peer review and the unsung heroes who do the job for little recognition or reward, and the absence of either would have profound implications for research and society as a whole.

Which brings me to the answer to the question posed above, which is to ask the opposite: how do you know when you cannot trust a journal? This is easier to answer, as you can point to the absence of all those characteristics and behaviours that you would want in a journal. In our work on Predatory Reports we see on a daily basis how the absence of crucial aspects of a journal’s workings can cause huge problems for authors: no listed editor, a fake editorial board, a borrowed ISSN, a hijacked journal identity, a made-up impact factor and, above all, false promises of a robust peer review process. Trust in peer review may require some research on the part of the author, checking the background of the journal, its publisher and its editors, and it may require you to contact the editor, editorial board members or published authors for personal advice on publishing in that journal. But doing that work in the first place and receiving personal recommendations will build trust in peer review for any authors who have doubts – and collectively for all members of the academic community.

Special report: Assessing journal quality and legitimacy

Earlier this year Cabells engaged CIBER Research (http://ciber-research.eu/) to support its product and marketing development work. Today, in collaboration with CIBER, Simon Linacre looks at the findings and implications for scholarly communications globally.


In recent months the UK-based publishing research body CIBER has been working with Cabells to better understand the academic publishing environment both specifically in terms of Medical research publications, and more broadly with regard to the continuing problems posed by predatory journals. While the research was commissioned privately by Cabells, it was always with the understanding that much of the findings could be shared openly to enable a better understanding of these two key areas.

The report – Assessing Journal Quality and Legitimacy: An Investigation into the Experience and Views of Researchers and Intermediaries – with special reference to the Health Sector and Predatory Publishing – has been shared today on CIBER’s website, and the following briefly summarizes the key findings of six months’ worth of research:

  • The team at CIBER Research was asked to investigate how researchers in the health domain went about selecting journals to publish their papers, what tools they used to help them, and what their perceptions of new scholarly communications trends were, especially with regard to predatory journals. Through a mixture of questionnaire surveys and qualitative interviews with over 500 researchers and ‘intermediaries’ (i.e. librarians and research managers), the research pointed to a high degree of self-sufficiency among researchers regarding journal selection
  • While researchers tended to use tools such as information databases to aid their decision-making, intermediaries focused on sharing their own experiences and providing education and training solutions to researchers. Overall, it was notable how much of a mismatch there was between what researchers said and what intermediaries did or believed
  • So-called ‘whitelists’ were common at national and institutional levels, as was the emergence of ‘greylists’ of journals to be wary of; however, there seemed to be no list of recommended journals in Medical research areas
  • In China, alongside huge growth in research and publication output, there are concerns that predatory publishing could have an impact, with one participant stating that, “More attention is being paid to the potential for predatory publishing and this includes the emergence of Blacklists and Whitelists, which are government-sponsored. However, there is not just one there are many 10 or 20 or 50 different (white)lists in place”
  • In India, the explosion of predatory publishing is perhaps the consequence of educational and research expansion and the absence of infrastructure capacity to deal with it. An additional factor could be the lack of significant impetus at a local level to establish new journals, unlike in countries such as Brazil; Indian universities are not legally able to establish new titles themselves. As a result, an immature market has attempted to develop new journals to satisfy scholars’ needs, which in turn has led to the rise of predatory publishing in the country
  • Predatory publishing practices seemed to be having an increased impact on mainstream publishing activities globally, with a grave risk of “potentially polluting repositories and citation indexes but there seems to have been little follow through by anyone.” National bodies, publishers and funders have failed to address this threat, which may have diverted funds away from legitimate publications to those engaged in illicit activities
  • Overall, predatory publishing is being driven by publish-or-perish scenarios, particularly with early career researchers (ECRs) where authors are unaware of predatory publishers in general, or of the identity of a specific journal. However, a cynical manipulation of such journals as outlets for publications is also suspected.

 

Survey responses: ‘Why do you think researchers publish in predatory journals?’

 


CIBER Research is an independent group of senior academic researchers from around the world, who specialize in scholarly communications and publish widely on the topic. Their most recent projects have included studies of early career researchers, digital libraries, academic reputation and trustworthiness.