Cabells renews partnership with CLOCKSS to further shared goals of supporting scholarly research

Cabells is excited to announce the renewal of its partnership with CLOCKSS, the decentralized preservation archive that ensures the long-term survival of scholarly content in digital format. Cabells will provide CLOCKSS with complimentary access to the Journal Whitelist and Journal Blacklist databases for an additional two years, furthering the organizations’ shared goals of supporting and preserving scholarly publications for the benefit of the global research community.

The goal of Cabells is to provide academics with the intelligent data needed for comprehensive journal evaluations to safeguard scholarly communication and advance the dissemination of high-value research.  Assisting CLOCKSS in its mission to provide secure and sustainable archives for the preservation of academic publications in their original format is a logical and rewarding collaboration.

“We are proud to renew our partnership with CLOCKSS. Our mission to protect the integrity of scholarly communication goes hand in hand with their work to ensure the secure and lasting preservation of scholarly research,” said Kathleen Berryman, Director of Business Relations with Cabells.

In helping to protect and preserve academic research, Cabells and CLOCKSS are fortunate to play vital roles in the continued prosperity of the scholarly community.


 

About: Cabells – Since its founding over 40 years ago, Cabells’ services have grown to include the Journal Whitelist and the Journal Blacklist, manuscript preparation tools, and a suite of powerful metrics designed to help users find the right journals, no matter the stage of their career. The searchable Journal Whitelist database covers more than 11,000 international scholarly publications across 18 academic disciplines. The Journal Blacklist is the only searchable database of predatory journals, complete with detailed violation reports. Through continued partnerships with major academic publishers, journal editors, scholarly societies, accreditation agencies, and other independent databases, Cabells provides accurate, up-to-date information about academic journals to more than 750 universities worldwide. To learn more, visit www.cabells.com.

About: CLOCKSS – CLOCKSS is a not-for-profit joint venture between the world’s leading academic publishers and research libraries. Its mission is to build a sustainable, international, and geographically distributed dark archive to ensure the long-term survival of web-based scholarly publications for the benefit of the greater global research community.

Bringing clarity to academic publishing

How do you know if a journal is a good or a bad one? It is a simple enough question, but clear information is scarce for researchers, and scams often lay traps for the unwary. In his latest post, Simon Linacre presents some new videos from Cabells that explain what it does to ensure authors can stay fully informed.


On a chilly spring day in Edinburgh, a colleague and I were asked to do what nobody really wants to do if they can help it, and that is to ‘act natural’. It is one of life’s great ironies that it is so difficult to act naturally when told to do so. However, it was for a good cause, as we had been asked to explain to people, through a short film, what it was that Cabells did and why we thought it was important.

Video as a medium has been relatively ignored by scholarly publishers until quite recently. Video has, of course, been around for decades, and it has been possible to embed video on websites next to articles for a number of years. However, embedding video into PDFs has been tricky, and as every publisher will tell you when they ask about user needs, academics ‘just want the PDF’. As a result, there has been little in the way of innovation in scholarly communication, despite some brave attempts such as video journals, video abstracts and other accompaniments to the humble article.

Video has been growing as a means of search, particularly for younger academics, and it can be much more powerful when it comes to engagement and social media. Stepping aside from the debate about what constitutes impact and whether Altmetrics and hits via social media really mean anything, video can be ‘sticky’ in the sense that people spend longer watching it than they do skimming words on a web page. As such, the feeling is that video is a medium whose time in scholarly communications is yet to come.

So, in that spirit, Cabells has shot a short video with some key excerpts that take people through the Journal Whitelist and Journal Blacklist. It is hoped that it answers some questions people may have, and spurs others to get in touch with us. The film is the first step in Cabells’ development of a range of resources across different platforms that will help researchers absorb knowledge of journals and optimize their decision-making. In a future of Open Access, new publishing platforms, and multiple publishing choices, the power to publish will increasingly be in the hands of the author, with the scholarly publishing industry seeking new ways to satisfy authors’ needs. Knowledge about publishing is the key to unlocking that power.

Updated CCI and DA metrics hit the Journal Whitelist

Hot off the press, newly updated Cabells Classification Index© (CCI©) and Difficulty of Acceptance© (DA©) scores are now available for all Journal Whitelist publication summaries. These insightful metrics are part of our powerful mix of intelligent data, supporting informed and confident journal evaluations.

Research has become increasingly cross-disciplinary and, accordingly, an individual journal might publish articles relevant to several fields.  This means that researchers in different fields often use and value the same journal differently. Our CCI© calculation is a normalized citation metric that measures how a journal ranks compared to others in each discipline and topic in which it publishes and answers the question, “How and to whom is this journal important?” For example, a top journal in computer science might sometimes publish articles about educational technology, but researchers in educational technology might not really “care” about this journal the same way that computer scientists do. Conversely, top educational technology journals likely publish some articles about computer science, but these journals are not necessarily as highly regarded by the computer science community. In short, we think that journal evaluations must be more than just a number.

[Screenshot: CCI 2019 updates]

The CCI© gauges how well a paper might perform in specific disciplines and topics and compares the influence of journals publishing content from different disciplines. Further, within each discipline, the CCI© classifies a journal’s influence for each topic that it covers. This gives users a way to evaluate not just how influential a journal is, but also the degree to which a journal influences different disciplines.
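To make the idea of discipline-level normalization more concrete, here is a minimal illustrative sketch of one way a journal’s citation performance could be ranked separately within each discipline it publishes in. It is not the CCI© methodology, which is proprietary; the journal names, citation rates, and the simple percentile approach below are all invented for the example.

```python
# Illustrative sketch only: this is NOT Cabells' proprietary CCI(c) formula.
# It shows one simple way a journal's citation rate could be ranked within
# each discipline it publishes in, using made-up data.

from collections import defaultdict

# Hypothetical (journal, discipline, citations per article) records
records = [
    ("Journal A", "Computer Science", 4.2),
    ("Journal B", "Computer Science", 2.1),
    ("Journal C", "Computer Science", 6.8),
    ("Journal A", "Educational Technology", 1.0),
    ("Journal D", "Educational Technology", 3.5),
    ("Journal E", "Educational Technology", 2.2),
]

# Group citation rates by discipline so each journal is compared only
# against other journals publishing in the same field.
by_discipline = defaultdict(list)
for journal, discipline, rate in records:
    by_discipline[discipline].append(rate)

def percentile_rank(value, population):
    """Share of the population that the value meets or exceeds (0-100)."""
    return 100.0 * sum(1 for v in population if v <= value) / len(population)

# A journal gets a separate, normalized rank for each discipline, reflecting
# that different research communities may value the same journal differently.
for journal, discipline, rate in records:
    score = percentile_rank(rate, by_discipline[discipline])
    print(f"{journal:10s} | {discipline:22s} | percentile {score:5.1f}")
```

In this toy example, Journal A ranks higher among the Computer Science journals than among the Educational Technology journals, which is the kind of discipline-by-discipline difference the CCI© is designed to surface.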

For research to have real impact it must first be seen, so maximizing visibility is a priority for many scholars. Our Difficulty of Acceptance© (DA©) metric gives researchers a better way to gauge a journal’s exclusivity and to balance the need for visibility against the very real challenge of getting accepted for publication.

[Screenshot: DA 2019 updates]

The DA© rating quantifies a journal’s history of publishing articles from top-performing research institutions. These institutions tend to dedicate more faculty, time, and resources to publishing often and in “popular” journals. A journal that accepts more articles from these institutions will tend to expect the kind of quality or novelty that those resources make possible. Researchers can therefore use the DA© to find the journals with the best blend of potential visibility and manageable exclusivity.
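By way of illustration only, the sketch below captures the general idea of measuring how concentrated a journal’s output is among top-performing institutions; the actual DA© calculation is proprietary and is not reproduced here, and the institution list, affiliation data, and rating thresholds are all hypothetical.

```python
# Illustrative sketch only: this is NOT the proprietary Difficulty of
# Acceptance(c) calculation. It simply measures the share of a journal's
# recent articles with at least one author from a (hypothetical) list of
# top-performing institutions, then maps that share onto a coarse band.

TOP_INSTITUTIONS = {"Univ X", "Univ Y", "Univ Z"}  # hypothetical list

# Hypothetical author affiliations for one journal's recent articles
articles = [
    ["Univ X", "Univ Q"],
    ["Univ Y"],
    ["Univ R", "Univ S"],
    ["Univ Z", "Univ X"],
]

# An article counts if any of its authors is based at a top institution.
top_share = sum(
    any(aff in TOP_INSTITUTIONS for aff in affils) for affils in articles
) / len(articles)

# Invented thresholds, purely to show how a share could become a rating.
if top_share >= 0.6:
    band = "higher exclusivity"
elif top_share >= 0.3:
    band = "moderate exclusivity"
else:
    band = "lower exclusivity"

print(f"Share of articles from top institutions: {top_share:.0%} ({band})")
```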

For more information on our metrics, methods, and products, please visit www.cabells.com.

The Journal Blacklist surpasses the 12,000-journal mark

Just how big a problem is predatory publishing? Simon Linacre reflects on this week’s news that the Cabells Journal Blacklist has reached 12,000 journals, and shares some insights into publishing’s dark side.


Predatory publishing has seen a great deal of coverage in 2019, with a variety of sting operations, opinion pieces and studies published on various aspects of the problem. On the one hand, there is no doubt that it is a problem for academia globally; on the other, there is huge debate as to the size, shape and relative seriousness of that problem.

On the first of those points, the size looks to be pretty big – Cabells announced this week that its Journal Blacklist has hit the 12,000 mark. This is less than a year since it hit 10,000, and it is now triple the size it was when it launched in 2017. Much of this is down to the incredibly hard work of its evaluations team, but it also reflects the fact that there are a LOT of predatory journals out there, with the numbers increasing daily.

On the last of those points, the aftershocks of the Federal Trade Commission’s ruling against OMICS earlier this year are still being felt. While there is no sign of any contrition on the part of OMICS – or of the $50m fine being paid – the finding has garnered huge publicity and acted as a warning to some academics not to entrust their research to similar publishers. In addition, it has been reported that CrossRef has now cut OMICS’ membership.

However, the shape of the problem is still hard for many to grasp, and perhaps it would help to share some of the tools of the trade of deceptive publishers. Take one journal on the Cabells Journal Blacklist – the British Journal of Marketing Studies.

[Screenshot: Cabells Journal Blacklist entry]

Sounds relatively normal, right? But a number of factors relating to this journal highlight many of the problems presented by deceptive journals:

  • The title includes the word ‘British’ as a proxy for quality; however, over 600 journals in the Blacklist include this descriptor, compared to just over 200 in Scopus’ entire index of more than 30,000 journals
  • The journal is published by European-American Journals alongside 81 other journals – a remarkable feat considering the publisher lists a small terraced house in Gillingham as its main headquarters
  • When Cabells reviewed it for inclusion in the Blacklist, it noted among other things that:
    • It falsely claimed to be indexed in well-known databases – we know this because among these was Cabells itself
    • It uses misleading metrics, including an “APS Impact Factor” of 6.80 – no such derivation of the Web of Science metric exists, apart from on other predatory journal sites
    • There is no detailed peer review policy stated
    • There is no affiliation for the Editor, one Professor Paul Simon, and searches cannot uncover any marketing professors with such a name (or a Prof. Garfunkel, for that matter)

This IS a problem for academia because, whatever the size and seriousness of predatory publishing may be, unless researchers learn to spot the signs of what it looks like, they will continue to be drawn in and to waste their research, funding dollars, and even careers on deceptive publishing practices.

When does research end and publishing begin?

In his latest post, Simon Linacre argues that in order for authors to make optimal decisions – and not to get drawn into predatory publishing nightmares – research and publishing efforts should overlap substantially.


In a recent online discussion on predatory publishing, there was some debate as to the motivations of authors who choose predatory journals. A recent study in the ALPSP journal Learned Publishing found that academics publishing in such journals usually fell into one of two camps: either they were “uninformed” that the journal they had chosen to publish in was predatory in nature, or they were “unethical” in knowingly choosing such a journal in order to satisfy some publication goals.

However, a third category of researcher was suggested: the ‘unfussy’ author who neither knows nor cares what sort of journal they are publishing in. Certainly, there may be some overlap with the other two categories, but what they all have in common is bad decision-making. Whether an author does not know, does not care, or does not mind which journal they publish in, it seems to me that they should do all three: know, care, and mind.

It was at this point that one of the group posed one of the best questions I have seen in many years in scholarly communications: when it comes to article publication, where does the science end in scientific research? Due in part to the terminology, as well as the differing processes, research and publication are regarded as somehow distinct or separate: part of the same ecosystem, for sure, but requiring different skills, knowledge and approaches. The question is a good one because it challenges this duality. Isn’t it possible for science to encompass some of the publishing process itself? And shouldn’t the publishing process become more involved in the process of research?

The latter is already happening to a degree, with major publishers moving up the supply chain to become more involved in research services provision (e.g. the acquisition of article platform services provider Atypon by Wiley). On the other side, there is surely an argument that, after the experiments or data collection, the logical analysis of data, and the writing up of conclusions, there is a place for the scientific process to be followed in choosing a legitimate outlet with appropriate peer review processes. Surely any university or funder would expect such a scientific approach at every level from their employees or beneficiaries. A failure to do this allows in not only sub-optimal choices of journal but, worse, predatory outlets that will ultimately delegitimize scientific research as a whole.

I get that it may not be such a huge scandal if some ho-hum research is published in a ‘crappy’ journal so that an academic can tick some boxes at their university. However, while the outcome may not be particularly harmful, the tacit acceptance of such lazy academic behavior surely has no place in modern research. Structures that force gaming of the system should, of course, be revised, but one can’t help thinking that if academics carried the same rigor and logic forward into their publishing decisions as they do in their research, scholarly communications would be in much better shape for all concerned.

Still without peer?

Next week the annual celebration of peer review takes place – a process which, despite being centuries old, is still an integral part of scholarly communications. To show Cabells’ support of #PeerReviewWeek, Simon Linacre looks at why peer review deserves its week in the calendar, and why it deserves to survive for many years to come.


I was recently asked by Cabells’ partners Editage to upload a video to YouTube explaining how the general public benefited from peer review. This is a good question, because I very much doubt the general public is aware at all of what peer review is and how it impacts their day-to-day lives. But if you reflect for just a moment, it is clear it impacts almost everything, much of which is taken for granted on a day-to-day basis.

Take a trip to the shops. A car is the result of thousands of experiments and a century of validated, peer-reviewed research into the safest and most efficient means of moving people and things from one place to another; each supermarket product has been health and safety tested; each purchase uses digital technology, such as the barcode, that has advanced over the years to enable fast and accurate purchasing; even the license plate recognition software that gives us a ticket when we stay too long in the car park will be the result of some peer-reviewed research (although most people may struggle to describe that as a ‘benefit’).

So, we do all benefit from peer review, even if we do not appreciate it all the time. Does that prove the value of peer review? For some, it is still an inefficient system for scholarly communications, and over the years a number of platforms have sought to disrupt it. For example, PLoS has been hugely successful as a publishing platform where a ‘light touch peer review’ has taken place to enable large-scale, quick turnaround publishing. More recently, F1000 has developed a post-publication peer review platform where all reviews are visible and take place on almost all articles that are submitted. While these platforms have undoubtedly offered variety and author choice to scientific publishing processes, they have yet to change the game, particularly in social sciences where more in-depth peer review is required.

Perhaps real disruption will come from accommodating peer review rather than changing it. This week’s announcement at the ALPSP Conference by Cactus Communications – part of the same organization as Editage – of an AI-powered platform that allows authors to submit articles to be viewed by multiple journal editors may just change the way peer review works. Instead of the multiple submit-review-reject cycles authors have to endure, they can submit their article to a system that checks for ‘hygiene factor’ quality characteristics and relevance to journals’ coverage, and matches them with potentially interested editors who can offer the opportunity for the article to then be peer reviewed.

If it works across a good number of journals, one can see that from the perspective of authors, editors and publishers, it would be a much more satisfactory process than the traditional one that still endures. And a much quicker one to boot, which means that the general public should see the benefits of peer review all the more speedily.

Cabells is proud to be COUNTER Release 5 Compliant

Cabells is excited to have passed an independent COUNTER audit, the final step to being deemed fully compliant with the COUNTER Release 5 Code of Practice.

[Embedded tweet from COUNTER]

COUNTER (Counting Online Usage of NeTworked Electronic Resources) is a non-profit organization that helps libraries from around the world determine the value of electronic resources provided by different vendors by setting standards for the recording and reporting of usage stats in a consistent and compatible way. The COUNTER Code of Practice was developed with the assistance of library, publisher, and vendor members through working groups and outreach.

By implementing the Code of Practice, publishers and vendors support their library customers by providing statistics in a way that allows for meaningful analysis and cost comparison. This allows libraries to closely assess user activity, calculate cost-per-use data, and make informed purchasing and infrastructure planning decisions, ensuring limited funds are spent in the most efficient way possible.
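As a simple example of the kind of analysis that consistent reporting enables, the sketch below computes cost-per-use figures from journal usage counts. The resources, costs, and usage totals are invented; in practice a library would take the counts from a delivered COUNTER Release 5 report (for example, the TR_J1 journal requests view) rather than hard-coding them.

```python
# Rough sketch of a cost-per-use calculation a library might run on
# COUNTER-style usage data. All figures are invented for illustration;
# real counts would come from a COUNTER Release 5 report supplied by
# the vendor (e.g. the TR_J1 "Journal Requests" standard view).

annual_subscription_cost = {   # hypothetical spend per resource (USD)
    "Resource A": 12000.0,
    "Resource B": 4500.0,
}

unique_item_requests = {       # hypothetical yearly usage totals
    "Resource A": 8400,
    "Resource B": 600,
}

for resource, cost in annual_subscription_cost.items():
    uses = unique_item_requests.get(resource, 0)
    cost_per_use = cost / uses if uses else float("inf")
    print(f"{resource}: ${cost_per_use:,.2f} per use ({uses} uses)")
```

A comparison like this (about $1.43 per use for Resource A versus $7.50 for Resource B in the invented figures) is exactly the sort of evidence libraries use when deciding where limited funds are best spent.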

For more information, check out the COUNTER website, which includes their Registries of Compliance.

Agile thinking

In early November, Cabells is very pleased to be supporting the Global Business School Network (GBSN) at its annual conference in Lisbon, Portugal. Ahead of the event, Simon Linacre looks at its theme of ‘Measuring the Impact of Business Schools’ and what this means for the development of scholarly communications.


For those of you not familiar with the Global Business School Network, it has been working with business schools, industry and charitable organizations in the developing world for many years, with the aim of enhancing access to high-quality, highly relevant management education. As such, it is now a global player in developing international networking, knowledge-sharing and collaboration in wider business education communities.

Cabells, in its position as a leader in publishing analytics, will support the Annual Conference in November and host a workshop on ‘Research Impact for the Developing World’. This session will focus on the nature of management research itself: whether it should focus on global challenges rather than just business ones, and whether it can be measured effectively by traditional metrics or whether new ones should be introduced. The thinking is that unless the business school community is more proactive about research and publishing models themselves, wider social goals will not be met and an opportunity to set the agenda globally will be lost.

GBSN and its members play a pivotal role here, both in seeking to take a lead on a new research agenda and in seizing the opportunity to be at the forefront of defining what relevant research looks like and how it can be incentivized and rewarded. With the advent of the United Nations Sustainable Development Goals (SDGs) – a “universal call to action to end poverty, protect the planet and ensure that all people enjoy peace and prosperity” – not only is there an increased push to change the dynamics of how research is prioritized, but with it comes a need to assess that research in different terms. This question will form the nub of many of the discussions in Lisbon in November.

So, what kind of new measures could be applied? Well, firstly, this assumes that measures can be applied at all, and there are many people who think that any kind of measurement is unhelpful and unworkable. However, academic systems are built around reward and recognition, so, to a greater or lesser degree, it is difficult to see measures disappearing completely. Responsible use of such measures is key, as is informed use of the range of data points available – this is why Cabells includes unique data such as time to publication, acceptance rates, and its own Cabells Classification Index© (CCI©), which measures journal performance using citation data within subject areas.

In a new research environment, just as important will be new measures such as Altmetrics, which Cabells also includes in its data. Altmetrics can help express the level of online engagement research publications have had, and there is a feeling that this research communications space will become much bigger and more varied as scholars and institutions alike seek new ways to disseminate research information. This is one of the most exciting areas of development in research at the moment, and it will be fascinating to see what ideas GBSN and its members can come up with at their Annual Conference.

If you would like to attend the GBSN Annual Conference, Early Bird Registration is open until 15th September.

FTC v. OMICS: a landmark predatory publishing case

In March of 2019, upon review of numerous allegations of predatory practices against the publisher OMICS International, the U.S. District Court for the District of Nevada ordered OMICS to pay $50.1 million in damages. The case marks one of the first judgments against a publisher accused of predatory practices and could be a signal of greater publisher oversight to come.


In March of this year, a US federal court ordered OMICS International to pay over $50 million in damages stemming from a 2016 lawsuit brought by the Federal Trade Commission (FTC), the first such action against a ‘predatory’ publisher.  The FTC was moved to act against the Hyderabad, India-based open access publisher and its owner, Srinubabu Gedela, after receiving a multitude of complaints from researchers concerning several systematic fraudulent practices.

In April we wondered if this decision would be more than a public record and condemnation of OMICS’ practices, but also act as a deterrent to other similar operations. Stewart Manley, a lecturer for the Faculty of Law at the University of Malaya, has gone deeper in examining this topic in two recent articles: “On the limitations of recent lawsuits against Sci‐Hub, OMICS, ResearchGate, and Georgia State University” (subscription required) featured in the current issue of Learned Publishing, and “Predatory Journals on Trial: Allegations, Responses, and Lessons for Scholarly Publishing from FTC v. OMICS” from the April issue of Journal of Scholarly Publishing (subscription required).

Mr. Manley was also recently interviewed for Scholastica’s blog where he addressed several key questions on this topic and felt that other questionable publishers will likely not be deterred if OMICS wins on appeal or simply refuses to comply with the order. He also lays out the key takeaways from FTC v. OMICS for publishers, academics, and universities.

Another recent article, “OMICS, Publisher of Fake Journals, Makes Cosmetic Changes to Evade Detection” by Dinesh C. Sharma for India Science Wire discusses a recent study showing the evolution of OMICS journals to mimic legitimate journals, making it difficult to distinguish between authentic and fake journals using the standard criteria. Rather than make substantive changes to their practices, OMICS is finding ways to more effectively evade quality checks.

Despite the hits OMICS has taken in actual court and in the court of public opinion, with an appeal in the offing, the final outcome of this matter is still to be determined. Additionally, as Mr. Manley points out in his Q&A, enforcing a judgment such as this is difficult, especially when the defendant is from a foreign jurisdiction. OMICS has yet to comply with the order and there is little reason to believe they ever will. We will continue to monitor this case and will provide updates as they become available.

The power of four

After hearing about the many different ways customers use its Journal Whitelist and Journal Blacklist, Cabells has started to map out how any researcher can use journal data to optimize their decision-making. Fresh from its debut at the World Congress on Research Integrity in Hong Kong last month, Simon Linacre shares the thinking behind the new ‘Four Factor Framework’ and how it could be used in the future.


The 6th World Congress on Research Integrity (WCRI) was held in Hong Kong last month, bringing together the great and the good of those seeking to improve the research process globally. Topics were surprisingly wide-ranging, covering such diverse areas as human rights, predatory publishing, data sharing, and policy. It was significant that, while much of the focus of the conference was on the need to improve education and learning on how to conduct research ethically, many of the cases presented showed that there is still much to do in this respect.

Cabells was also there and used its presence to share some ideas on how to overcome some of these challenges, particularly with regard to engagement with improved research and publishing practices. Taking the established issues within predatory publishing encountered the world over as a starting point (i.e. choosing the wrong journal), as well as the need to look at as much data as possible (i.e. choosing the right journal), Cabells has very much put the author at the center of its thinking to develop what it has called the ‘Four Factor Framework’:

[Image: the Four Factor Framework]

The framework, or FFF, first puts the onus on the researcher to rule out any poor, deceptive or predatory journals, using resources such as the Blacklist. This ‘negative’ first step then opens up the next stage, which is to take the four following factors into account before submitting a research paper to a journal:

  • Strategic: understanding how a journal will impact career ambitions or community perspectives
  • Environmental: bringing in wider factors such as research impact or ethical issues
  • Political: understanding key considerations such as publishing in titles on journal lists, avoiding such lists or journals based in certain areas
  • Cultural: taking into account types of research, peer review or article form

Having talked to many customers over a period of time, we have seen that these factors all become relevant to authors at some point during that crucial period when they are choosing which journal to publish in. Customers have told Cabells that their use of the Whitelist and Blacklist – as well as other sources of data and guidance – can be thought of in terms of benchmarking, performance focus or risk management. While it is good to see that the databases can help so many authors in so many different ways, judging by the evidence at WCRI there is still a huge amount to do in educating researchers to take advantage of these optimized approaches. This will be the main aim of Cabells’ emerging strategy: to enable real impact by researchers and universities through the provision of validated information and support services around scholarly publishing.