PRW 2023: “Peer Review and The Future of Publishing”

Each year, our team at Cabells celebrates Peer Review Week (PRW) and recognizes the fact that so much of the work we do each day revolves around peer review, which is the backbone of scholarly communication and the key to maintaining research quality. The theme this year for PRW is “Peer Review and the Future of Publishing,” which would be an appropriate theme every year. To work as intended and as needed, peer review will need to continuously adapt and evolve along with publishing.

The importance of peer review to the quality and overall success of a journal can’t be overstated. For a journal to be recognized in the academic or medical community as legitimate, a robust peer review system must be in place. In recent years, the scholarly community has been shown time and again the results of substandard (or nonexistent) peer review. It has also become clear that identifying an effective and efficient model of peer review has proven to be a challenge for publishers.

Our friend, Daley White, a research scientific editor with the Moffitt Cancer Center, has written an excellent piece discussing the current state of peer review and highlighting a few promising alternative strategies. That piece, along with another by Daley discussing the role of generative artificial intelligence in peer review, should not be missed.

The bedrock of scholarly publishing

At its core, peer review is about benefiting the knowledge base by establishing quality control over published research, which is then used to generate more knowledge. By publishing research papers that have been thoughtfully peer reviewed, academic journals make it possible for researchers around the world to learn about the latest findings in their field. This helps to advance knowledge and foster collaboration among researchers. Researchers, funders, and the public all expect that research has been reviewed, is sound, and is worthy of being built upon.

Peer review helps to ensure that published work is high quality, with findings that are accurate and reliable, by helping to identify and correct errors, omissions, and biases. Ultimately, authors are responsible for conducting sound research and not fabricating data or results. Unfortunately, the immense pressure to publish, along with the industry’s unwillingness to publish null results, makes this responsibility an insurmountable challenge for some.

For a journal to be considered for inclusion in Journalytics, our evaluators must have evidence of a rigorous peer review system.

To be effective, peer review must be unbiased and transparent, though the extent to which journals are open about their review process varies. Promoting and expanding transparency and accountability in the research and peer review processes shows readers how a paper was evaluated and helps them understand the reasons for its acceptance or rejection. This, in turn, builds trust in both the publication process and the research itself.

Time after time

Can it be assumed that peer review is consistently conducted with the necessary rigor when in most cases it is added to the workload of already very busy and time-strapped reviewers? Most workplaces don’t provide an allowance of time for peer review, and there is no compensation for conducting reviews. So, without incentives, peer review is conducted solely to contribute to a knowledge base that needs to be carefully managed and safeguarded.

Along with pressure on scholars to find the time to conduct reviews, there is pressure on journals to review papers quickly. But can speed be reconciled with quality? Speedy peer review, when taken to an extreme, is an indication of the type of substandard or virtually nonexistent peer review often found in predatory journals.

While it’s important to authors that articles are published in a timely manner (which requires timely peer review), there is a tension between speed and quality that the industry as a whole must manage. Often, the state of a journal’s peer review process comes down to resources: not every journal can afford to keep an in-house statistician on staff to review research statistics. Training in peer review as part of PhD programs would also be valuable. While early career researchers (ECRs) are very knowledgeable in their fields despite being relatively inexperienced, having ECRs conduct peer review with no training is less than optimal.

So, this PRW we will consider these and other ideas as we continue our work as champions of peer review – and Cabells team member Clarice continues her work as a member of the PRW Steering Committee. Our work at Cabells will adapt and evolve right along with peer review and publishing into the future. What won’t change is the key role played by peer review in maintaining the quality, transparency, and accountability of research and the integrity of knowledge.

Introducing the All-New Journalytics Academic & Predatory Reports

We have some exciting news to share – a new and improved Journalytics Academic & Predatory Reports platform will soon be here. Our team has been working on multiple updates and enhancements to our tried and true platform that will benefit users in multiple ways. Along with our ongoing addition of new verified and predatory journals, users will experience better search results, new data points and visualizations, increased stability and speed, and more secure logins.

In addition to the visual elements and expanded analytics of this redesign, a key component is the full integration of our Journalytics and Predatory Reports databases. This integration will allow for comprehensive searches that present the full range of publishing opportunities and threats in a given area. Our goal is to facilitate journal discovery and evaluation so our users know the journals and know the risks.

Last month we hosted a webinar to give users a sneak peek at the upcoming changes, which include a new guided search page to jumpstart journal discovery, updated platform and journal card designs, and new data points such as fees and article output. Check out the video below or visit our YouTube channel where you’ll find a time-stamped table of contents in the description for easy navigation to specific points in the video.

A preview of the all-new Journalytics Academic & Predatory Reports.

A fresh look with more data

Guided search

The path to journal discovery now begins on our guided search page, where users can

  • search for a journal by title, ISSN, or keyword
  • access our database of legitimate and predatory journals
  • seamlessly sort verified journals with one of our featured metrics shortcuts
  • jump directly to one of 18 academic disciplines

Our guided search page with shortcuts to journal platforms, featured metric sorting, and disciplines.

New fully-integrated platform

Our redesigned platform now features full integration of verified and predatory journals into the same environment. First rolled out as part of our Journalytics Medicine & Predatory Reports platform, the integration has proven to be an invaluable enhancement. Users can feel confident with fully comprehensive search results that capture both legitimate and deceptive publishing opportunities, and they’ll also have the ability to filter out one or the other with just one click.

NO MORE GUESSWORK: Search results now include both legitimate and predatory journals

Search for publications by title, ISSN, discipline, or other keywords and know that we’ve left nothing to chance – verified and predatory journals each have their own design and data type, making it clear whether a journal is listed in Journalytics or Predatory Reports.

Redesigned journal cards with new data points

Judging the quality of a journal, the likelihood of manuscript acceptance, publication timelines, and the potential impact of a journal can be difficult. To assist with clear and confident publication evaluations, we have added a few new data points to verified records to facilitate decision-making:

  • Open Access details – copyrights, archiving, and access details
  • Fees – who pays for publishing articles and how much?
  • Article output – how often does a journal publish per year?
  • Reimagined CCI visualization – our citation-backed metric, which shows the historical “influence,” or citation activity, for each discipline in which a journal publishes.

The new Journalytics Academic will include the beta version of a new metric: the SDG Impact Intensity™ (SDGII), developed in partnership with the Saint Joseph’s University Haub School of Business.

The SDGII seeks to contextualize and understand the relevance of academic research in terms of the United Nations’ Sustainable Development Goals. Climate change, sustainability, and equity are among the most powerful forces for change in society, and yet they are ignored by traditional citation-based metrics. We hope to help lead the charge to change this dangerous oversight.

This is a pilot program that currently covers a limited number of business journals but will soon be expanded to increase awareness of sustainability-minded journals publishing impactful research.

For more information, see our videos covering the SDGII on our YouTube channel.

New look, same deceptive operations

In recent years, awareness of the nature and scope of the problem presented by predatory publishers has increased within the scholarly community. At the same time, predatory publishers themselves have become more aware of what they need to do to appear legitimate and avoid detection. Their tactics are evolving right along with efforts to combat them, and their numbers are growing – we currently have reports on more than 17,000 predatory publications.

Each Predatory Reports listing includes:

  • Journal identification – each report provides the title, publisher, discipline(s), ISSN (if available), and website links for journal discovery and confirmation.
  • Violation categories – journal reports monitor the areas in which the deceptive behaviors occurred.
  • Violation severity – reports also track the severity of the deceptive behaviors.

By providing not just identifying information for predatory journals, but also a report on the nature, scope, and severity of their behaviors, we aim to equip our users with an understanding of the varied tactics predatory publishers employ. Our goal is to educate and inform researchers about the different profiles and archetypes of predatory journals we uncover, so they are better able to identify and avoid them as their careers progress.
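The report components above can be pictured as a simple record structure. Here is a rough sketch in Python; the field names, severity scale, and the journal itself are invented for illustration and are not Cabells’ actual schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Illustrative data model only: field names and the severity scale are our
# own invention, not Cabells' actual Predatory Reports schema.
@dataclass
class Violation:
    category: str   # area of deceptive behavior, e.g. peer review, fees
    severity: str   # e.g. "minor", "moderate", "severe"

@dataclass
class PredatoryReport:
    title: str
    publisher: str
    disciplines: List[str]
    issn: Optional[str]       # predatory journals often lack (or fake) an ISSN
    website: Optional[str]
    violations: List[Violation] = field(default_factory=list)

# A fictitious journal, for illustration.
report = PredatoryReport(
    title="Journal of Advanced Everything",
    publisher="Example Press",
    disciplines=["multidisciplinary"],
    issn=None,
    website="http://example.invalid",
    violations=[
        Violation("peer review", "severe"),
        Violation("fees", "moderate"),
    ],
)
print(f"{report.title}: {len(report.violations)} violations on record")
```

Tracking violations as a list of category/severity pairs, rather than a single flag, is what lets a report describe the nature and scope of a journal’s behavior instead of a bare verdict.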

What’s next?

Be on the lookout for further updates on the timing of the release of the updated platform, as well as information on the upcoming new website that will serve as the central hub for all of our resources, complete with a portal for platform access, links to product information and criteria, and our blog.

We will also host another webinar as we move closer to the launch date for a final look at the upcoming enhancements. Stay tuned!

Originality in Academic Writing: The Blurry Lines Between Similarity, Plagiarism, and Paraphrasing

Across disciplines, most manuscripts submitted to academic journals undergo a plagiarism check as part of the evaluation process. Authors are widely aware of the industry’s intolerance for plagiarism; however, most of us don’t receive any specific education about what plagiarism is or how to avoid it. Here, we’ll discuss what actually constitutes plagiarism, examine the important differences between similarity and plagiarism, and cover easy strategies to avoid the problem in the first place.

What actually is plagiarism?

The University of Oxford defines plagiarism as “presenting work or ideas from another source as your own… without full acknowledgement.” This is a fairly fundamental definition, and for most of us, this is the extent of our education on plagiarism.

When we think of plagiarism, we usually imagine an author intentionally copying and pasting text from another source. This form of plagiarism, called direct plagiarism, is actually fairly uncommon. More commonly, plagiarism takes the form of:

  • Accidental plagiarism. Citing the wrong source, misquoting, or unintentionally/coincidentally paraphrasing a source that you’ve never seen before is still considered plagiarism, even when done without intent.
  • Secondary source plagiarism. This is an interesting and challenging issue to tackle. This form of plagiarism refers to authors using a secondary source but citing the works found in that source’s reference list—for example, finding information in a review but citing the primary study rather than the review itself. This misattribution “fails to give appropriate credit to the work of the authors of a secondary source and… gives a false image of the amount of review that went into research.”
  • Self-plagiarism. There’s still an ongoing debate over the acceptability of reusing one’s own previous work, with differing answers depending on the context. It’s widely agreed that reusing the same figure, for example, in multiple articles without correct attribution to the original publication is unethical; however, it’s less clear whether it’s acceptable to reuse verbatim in a manuscript the same abstract you previously published as part of a conference poster. Copyright law can also be a major factor in the permissibility of self-plagiarism in niche cases.
  • Improper paraphrasing plagiarism. Some of us have heard from teachers or peers that, to avoid plagiarism, all you need to do is “rewrite it in your own words.” However, this can be misleading, as there’s a fine line between proper and improper paraphrasing. Regardless of the specific language used, if the idea or message being communicated is not your own and is not cited, it’s still plagiarism.

Plagiarism vs similarity

Many publishers, including Elsevier, Wiley, and Springer Nature, perform plagiarism evaluations on all manuscripts they publish using the software iThenticate. However, ‘plagiarism evaluation’ through iThenticate is a bit of a misnomer: iThenticate checks for similarity, not plagiarism. Though the difference between the two may seem minor, their implications are entirely different.

Whereas plagiarism refers to an act of ethical misconduct, similarity refers to any portion of your paper that recognizably matches text found in previously published literature in iThenticate’s content database. Similarity can include text matches in a manuscript’s references, affiliations, and frequently used/standardized terms or language; it’s natural and nonproblematic, unlike plagiarism.

A similarity report, such as the one generated by iThenticate, is used by editors as a tool to investigate potential concerns for plagiarism. If an iThenticate similarity report flags large sections of text as similar to a previously published article, the editors can then use this as a starting point to evaluate whether the article is, in fact, plagiarized.
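The division of labor is worth making concrete: software measures textual overlap, and humans judge misconduct. Below is a deliberately minimal sketch of one way a similarity score can be computed, using word n-gram (“shingle”) overlap. iThenticate’s actual matching is proprietary and far more sophisticated, and the example passages here are invented:

```python
def ngrams(text, n=3):
    """Split text into the set of overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(doc, source, n=3):
    """Fraction of the document's n-grams that also appear in the source.

    A high score only flags text for human review; by itself it proves
    nothing, since quotations, references, and stock phrases also match.
    """
    doc_shingles = ngrams(doc, n)
    if not doc_shingles:
        return 0.0
    return len(doc_shingles & ngrams(source, n)) / len(doc_shingles)

published = ("peer review is the backbone of scholarly communication "
             "and the key to maintaining research quality")
submitted = ("as we know, peer review is the backbone of scholarly "
             "communication in every field")

score = similarity(submitted, published)
print(f"{score:.0%} of the submission's 3-grams match the source")
```

Even a 50% match like this one is only a starting point: the editor still has to determine whether the overlap is quoted, cited, standard terminology, or plagiarized.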

Strategies to avoid plagiarism

Being accused of plagiarism—especially direct, intentional plagiarism—can be a serious ethical issue, with long-term implications for your job prospects and career. The best way to avoid this issue is to use preventative measures, such as the ones discussed here.

  1. Educate yourself about what plagiarism truly is and how it occurs. If you’ve made it this far in the article, you’re already making great progress! Consider reviewing the sources cited in this article to continue your education.
  2. Start citing from the note-taking phase. Many times, accidental plagiarism is the result of forgetting the source of an idea or statistic during quick, shorthanded note-taking. To avoid this, start building your reference list from the beginning of your research phase. Consider adding a quick citation for every single line of text—including citing yourself for your own ideas! Reference management software like EndNote and Zotero are great tools for this.
  3. Understand proper and improper paraphrasing. Learn how to correctly paraphrase someone else’s work and the importance of citing your paraphrased text. If you come across a phrase or sentence that perfectly summarizes an idea, don’t be afraid to include it as a direct, cited quotation!
  4. Consider cultural differences in plagiarism policies. This article aligns with the United States’ view of plagiarism as a serious ethical offense. However, this isn’t the case in all countries. In some East Asian countries, for example, the concepts of universal knowledge and memorization as a sign of respect lead to wider cultural acceptance of practices that would be considered plagiarism in America. If you’re unsure about the expectations of a certain journal, it’s always recommended to ask the editors.
  5. Use a similarity checker. While a similarity review won’t directly identify plagiarism, it can be great as a final scan for any text you may have copied and pasted with the intention of removing but forgot to erase, and it can help pick up accidental plagiarism! If your institution doesn’t have access to iThenticate, Turnitin, or CrossRef, there are several free similarity scanners available online.

Journalytics Update: Twenty Hindawi Journals Recently Removed from the Journalytics Academic and Journalytics Medicine Databases

As part of our ongoing mission to protect and foster research integrity, the following journals from the publisher Hindawi have been removed from our Journalytics Academic and Journalytics Medicine databases for failure to meet our quality criteria, pending re-evaluation of their policies and practices:

  • Advances in Materials Science and Engineering (ISSN: 1687-8434)
  • BioMed Research International (ISSN: 2314-6133)
  • Computational and Mathematical Methods in Medicine (ISSN: 1748-670X)
  • Computational Intelligence and Neuroscience (ISSN: 1687-5265)
  • Contrast Media & Molecular Imaging (ISSN: 1555-4309)
  • Disease Markers (ISSN: 0278-0240)
  • Education Research International (ISSN: 2090-4002)
  • Evidence-Based Complementary and Alternative Medicine (ISSN: 1741-427X)
  • Journal of Environmental and Public Health (ISSN: 1687-9805)
  • Journal of Healthcare Engineering (ISSN: 2040-2295)
  • Journal of Nanomaterials (ISSN: 1687-4110)
  • Journal of Oncology (ISSN: 1687-8450)
  • Journal of Sensors (ISSN: 1687-725X)
  • Mathematical Problems in Engineering (ISSN: 1024-123X)
  • Mobile Information Systems (ISSN: 1574-017X)
  • Oxidative Medicine and Cellular Longevity (ISSN: 1942-0900)
  • Scanning (ISSN: 0161-0457)
  • Scientific Programming (ISSN: 1058-9244)
  • Security and Communication Networks (ISSN: 1939-0114)
  • Wireless Communications & Mobile Computing (ISSN: 1530-8669)

Wiley’s statement confirming ‘compromised articles’ in Hindawi special issues, coupled with strong evidence that at least some of the retracted content was generated by paper mills, points to the absence of a functional peer review system at the above-listed journals. The backbone of not just any legitimate, trustworthy journal, but of all academic and medical publishing, is a robust and closely managed peer review process.

We covered the wave of retraction notices in recent years from scientific and medical publications on our Journalytics Medicine blog in November. Retractions are, to a certain extent, ‘part of the process’ for journals, but retractions at this level by one publisher show a breakdown in that process. It is our hope that the removal of these journals from our databases will motivate all scholarly and medical publishers to review their current publication processes and make the necessary improvements or changes to any substandard elements.

The Predator Effect: Understanding the Past, Present and Future of Deceptive Academic Journals

During his time working at Cabells, predatory publishing practices turned into a near obsession for Simon Linacre – so much so that he wrote a book about them: The Predator Effect. Here he shares details of the book and how predatory journals could form part of a publishing ethics crisis.


In a recent conversation with a senior academic regarding publishing ethics, the discussion veered between predatory publishing, paper mills, paraphrasing software and the question of whether an article written by AI could be regarded as an original piece of work. Shaking his head, the academic sighed and exclaimed: “Retirement is starting to look pretty good right now!” The conversation demonstrated what a lot of people in scholarly communications feel right now, which is that at this moment in time, we are losing the arms race when it comes to research integrity and publishing ethics.

In the last year, we have seen the number of predatory journals included in Cabells’ Predatory Reports database approach 17,000, thousands of articles retracted by major publishers such as Wiley and IOP, and major scandals, such as one I worked on with the Digital Science company Ripeta, where one author was responsible for dozens of plagiarised articles. The concern is that many more articles may have leaked into the scholarly communications system from paper mills, coupled with leaps in technology that enable students and authors to buy essays and articles generated by AI without lifting a finger themselves. No wonder older scholars who didn’t have to deal with such technologies are shaking their heads in despair.

Negative Impact

These issues can seem rather abstract, as they don’t necessarily translate into tangible impacts for most people, but this also means they can be misunderstood and underestimated. For example, what happens when an individual reads about a cure in a predatory journal, tries to use it, and makes a patient’s condition worse? Or when someone qualifies for a position based on coursework they cheated on? There are numerous instances where a breakdown in ethics and integrity can cause major problems.

More broadly, the whole fabric of trust that society has in academic research risks being undermined with so many options open to bad actors if they wish to buck the system for their own ends. We have seen this with the fateful Wakefield article about the MMR vaccine in the 1990s, the effects of which are still being felt today. That was an anomaly, but if people ceased to believe that published research was trustworthy because of these numerous threats, then we will indeed be in a perilous position.

Digital Solutions

The scale of these problems can be seen in three recent publications, which I discussed in a recent talk at the ConTech 2022 Conference in London:

  • In September, The State of Trust & Integrity in Research (STIR) report was published by Ripeta, which outlined some of the issues facing research integrity, and how greater standardisation and investment in technology is required
  • In October, the State of Open Data (SoOD) report was published by Figshare, Digital Science and Springer Nature. It presented the results of a huge survey of researchers, which showed that open data sharing was growing only gradually and that policymaking needed to be more joined up and consistent
  • In November, The Predator Effect was published – a short open access ebook detailing the history and impact of predatory publishing practices. 

While each of these publications offers some sobering findings in terms of the problems faced by scholarly communications, they also offer some hope that technology might provide solutions in the future. In terms of predatory journals, this means not only using technology as one solution, but combining multiple solutions in a joined-up way. As I say in the book:

“Using technology to improve hygiene factors such as legitimate references may be another strategy that, if adopted together and more widely, will have a significant impact on predatory journal output.” (Linacre, 2022)

Concerns around trust in science are real, but so is the anticipation that technology can show how scholarly communications can move forward. As a former publisher, I thought technology could easily solve the problem, but thanks to working at Cabells I came to understand that much more work is required to equip researchers with the right tools, knowledge and know-how to avoid predatory journals. In the past, collaboration in the industry has often been slow and not fully inclusive, but this will have to change if a breakdown in research integrity and publication ethics is to be avoided.

Measuring Sustainability Research Impact

Cabells was excited and honored to have the opportunity to take part in the EduData Summit (EDS), which took place at the United Nations in New York City in June. The EDS is the “world’s premium forum for data-driven educators – a platform for strategists, data scientists, CIOs and other dataheads to discuss and share best practices at the intersection of big data, predictive analytics, learning analytics, and education.”

The theme of this year’s Summit was “The Virtuous Circle: Sustainable and inclusive life-long learning through EduData” and sessions focused on topics such as access to education, continued and distance education, innovation in data science and AI, and sustainability.

Cabells CTO Lucas Toutloff was joined by Rachel Martin, Global Sustainability Director at Elsevier, and David Steingard from Saint Joseph’s University’s Haub School of Business for the virtual presentation “Industry-University Collaboration for Impact with the UN SDGs.” The panel discussed the importance of connecting research and science to the United Nations Sustainable Development Goals (SDGs) broadly and, more specifically, of bridging the gap between researchers and practitioners. The SDGs are 17 interconnected goals spanning a large set of environmental, social, and economic topics and represent a universal call to action for building a more sustainable planet by 2030.

“Industry-University Collaboration for Impact with the UN SDGs” presented at the EduData Summit at the United Nations, June 2022

Scholarly publishing can steer research and innovation toward the SDGs if we specifically and collectively shift the focus to address these crucial objectives and solutions. Researchers must lead the way by providing solutions for practitioners to put into action. Cabells, as one of the first U.S. organizations and non-primary publishers globally to be awarded membership to the SDG Publishers Compact, along with having the privilege of being part of the Compact’s Fellows Group, is fully invested in helping to leverage the power of scholarly publishing to achieve the SDGs.

The SDG Publishers Compact and Fellows Group

The SDG Publishers Compact’s core mission is to create practical and actionable recommendations for stakeholders in every corner of academic research – publishers, editors and reviewers, researchers and students, authors and librarians – for how they can keep the SDGs at the forefront of their research agendas so we can collectively bridge the gap between research and practice.

The goal of the Compact Fellows Group is to encourage all areas of the ecosystem to share in incentivizing researchers to perform work that supports and addresses the SDGs and to help smooth the transition from research to practice. The Fellows Group has created specific best practices and recommendations for each sector that can be acted upon immediately to drive research into the hands of practitioners. The goal is to incentivize research that drives innovation to address the SDGs, which means we need ways to parse through, discover, and measure this research, because “what gets measured gets done.”

A major component of this process is establishing a broad spectrum of reporting and insights to drive incentives and measures of impactful research, gauging how an institution, individual researcher, or journal is performing in terms of the SDGs. SDG Publishers Compact members have a responsibility to drive research to action and impact, devise ways to measure its effectiveness, reward those who conduct and publish impactful research in impactful journals, and continue to encourage those who don’t.

The SDG Impact Intensity Journal Rating

Toward this end, and in the spirit of SDG 17, “Partnerships for the Goals,” we are working with SJU on a publisher-neutral, AI-driven academic journal rating system assessing scholarly impact on the SDGs, called the SDG Impact Intensity™ (SDGII) journal rating. Data, scholarship, and science will be the driving forces for meeting the 2030 goal, and as SDG research output increases, funders, universities, and commercial and not-for-profit organizations need to know that money, time, and research are being well spent and are having an impact.

We have discussed (here and here) our commitment to doing our part to advance progress on meeting the SDGs and, ultimately, the 2030 Agenda. Our work with Professor Steingard and his team from SJU in developing the SDGII to help business schools determine the impact their research is having on society by addressing global crises has been some of our most rewarding work. Working within the business school ecosystem, we’re examining how the SDGs can inspire a transformation from quality to impact in business by looking at journals in terms of their alignment and taxonomy connection to the SDGs.

The top 50 business school journals (according to the Financial Times in 2016) were examined by the United Nations PRME group, which discovered that only 2.8% of articles published in ‘top-tier’ journals address challenges such as poverty, climate change, clean energy, water, and equality. This problem continues today: many of the same journals are still among the top in business journal rankings, and they are not championing and featuring impactful research to any meaningful degree.

Cabells and SJU are trying to address this problem through the SDGII by shifting the philosophy on what “counts” when looking at business journals and noting which publications are driving impact with respect to the SDGs. We are working to integrate, promote, and ultimately change the benchmarks of what matters in academic output and the data that drives decision-making.

To continue to promote this initiative and encourage the shift from quality to impact, we were thrilled to have the opportunity to discuss our progress at the AACSB’s International Conference and Annual Meeting (ICAM), in April in New Orleans.

Sustainability is the crisis of our generation, and sustainability-mindedness has become an important focus of academic research. The SDGII is designed to give stakeholders on every level the ability to measure what they’re doing and to serve as a cross-motivational tool to drive the industry forward on issues of sustainability. As mentioned earlier, when it comes to incentives, what gets measured gets done. The traditional metrics for evaluating the quality of research journals focus mainly on citation intensity, which evaluates journals based on how much they are used and cited. While this makes sense on some level (research must be read to have an impact, after all), it misses the mark by not considering, and measuring, impact on the SDGs.

The SDGII is an alternative, complementary metric that will evaluate a journal’s SDG research output through artificial intelligence and machine learning and build a profile demonstrating the publication’s impact on these issues. Rather than throw out the traditional approach to evaluating the quality and value of a journal, we are seeking to build on the foundation that good journals have in terms of scholarly rigor, audience, citations, and rankings. We want to move the needle to highlight research and journals that address the SDGs, and the SDGII will help business schools demonstrate how their research is achieving societal impact and meeting the Global Goals.
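The mechanics of such a rating can be illustrated with a toy version: tag each article against an SDG taxonomy, then report the share of a journal’s output touching any SDG. The real SDGII uses AI and machine learning over a far richer taxonomy; the keyword lists and abstracts below are invented for illustration only:

```python
# Hypothetical illustration only: the real SDGII uses AI/machine learning
# over a rich taxonomy; these keyword lists and abstracts are invented.
SDG_KEYWORDS = {
    "SDG 7 (Affordable and Clean Energy)": {"renewable", "solar", "energy"},
    "SDG 13 (Climate Action)": {"climate", "emissions", "carbon"},
}

def tag_article(abstract):
    """Return the SDGs whose keywords appear in an article abstract."""
    words = set(abstract.lower().split())
    return [sdg for sdg, keywords in SDG_KEYWORDS.items() if words & keywords]

def journal_sdg_intensity(abstracts):
    """Share of a journal's articles tagged to at least one SDG."""
    if not abstracts:
        return 0.0
    return sum(1 for a in abstracts if tag_article(a)) / len(abstracts)

# A fictitious business journal's recent output.
abstracts = [
    "pricing strategies for consumer goods",
    "carbon emissions reporting in supply chains",
    "solar energy adoption by small firms",
    "executive compensation and firm performance",
]
print(f"SDG-tagged share: {journal_sdg_intensity(abstracts):.0%}")
```

Because the score reflects the share of output aligned with the SDGs rather than how often articles are cited, it complements citation intensity instead of replacing it.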

Update: A Journal Hijacking

Editor’s Note: This is an updated version of an article originally posted in August 2021.


As members of our journal evaluation team work their way around the universe of academic and medical publications, one of the more brazen and egregious predatory publishing scams they encounter is the hijacked, or cloned, journal.  One recent case of this scheme uncovered by our team, while frustrating in its flagrance, also offered some levity by way of its ineptitude. But make no mistake, hijacked journals are one of the more nefarious and injurious operations carried out by predatory publishers. They cause extensive damage not just to the legitimate journal that has had its name and brand stolen, but to medical and academic research at large, their respective communities of researchers and funders, and, ultimately, society.

There are a few different variations on the hijacked journal, but all involve a counterfeit operation stealing the title, branding, ISSN, and/or domain name of a legitimate journal to create a duplicate, fraudulent version of the same. They do this to lure unsuspecting (or not) researchers into submitting their manuscripts (on any topic, not just those covered by the original, legitimate publication) with promises of rapid publication for a fee.

A recent case of journal hijacking investigated by our team involved the legitimate journal, Tierärztliche Praxis, a veterinary journal out of Germany with two series, one for small and one for large animal practitioners:

The legitimate website for Tierärztliche Praxis

The journal was cloned by this counterfeit operation, using the same name:

The website for the hijacked version of Tierärztliche Praxis

One of the more immediate problems caused by cloned journals is how difficult they make it for scholars to discover and engage with the legitimate journal, as shown in the image below of Google search results for “Tierärztliche Praxis.” The first several search results refer to the fake journal, including the top result which links to the fake journal homepage:

“Tierärztliche Praxis” translates to “veterinary practice” in English, and the legitimate journal is, of course, aimed at veterinary practitioners. Not so for the fake Tierärztliche Praxis “journal” (whose “publishers” didn’t bother to find out what “tierärztliche” means), which claims to be a multidisciplinary journal covering all subjects and accepts articles on anything from anyone willing to pay to be published:

Aside from the more obvious signs of deception found with the cloned journal (a poor website with duplicate text and poor grammar, an overly simple submission process, and no defined range of covered topics, to name a few), this journal’s “archive” of (stolen) articles takes things to a new level:

Above: the original article, stolen from Tuexenia, vs. the hijacked version

A few things to note:

  • The stolen article shown in the pictures above is not even from the original journal that is being hijacked, but from a completely different journal, Tuexenia.
  • A white rectangle near the top left of the page covers the original journal’s title, the hijacked journal’s title and ISSN are poorly superimposed in the header of each page, and volume information and a page number are added to the footer (without bothering to redact the original article’s page numbers).
  • The FINGER at the bottom left of just about every other page of this stolen article.

Predatory Reports listing for the hijacked version of Tierärztliche Praxis

Sadly, not all hijacked or otherwise predatory journals are this easy to spot. Medical and academic researchers must be hyper-vigilant when selecting a publication to which to submit their work. Refer to Cabells’ Predatory Reports criteria to become familiar with the tactics used by predatory publishers. Look at journal websites with a critical eye and be mindful of some of the more obvious red flags, such as promises of fast publication, no information on the peer review process, dead links or poor grammar on the website, or pictures (with or without fingers) of obviously altered articles in the journal archives.
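For researchers who want to vet a journal systematically, the red flags above can be treated as a simple checklist. The sketch below is a hypothetical illustration only, not part of Cabells’ actual Predatory Reports criteria; all field names are invented for this example.

```python
from dataclasses import dataclass

@dataclass
class JournalSignals:
    # Each flag mirrors one red flag from the checklist above.
    # Field names are illustrative, not Cabells' criteria codes.
    promises_rapid_publication: bool = False
    peer_review_undocumented: bool = False
    dead_links_or_poor_grammar: bool = False
    scope_unlimited: bool = False          # claims to cover "all subjects"
    archive_articles_look_altered: bool = False

def red_flag_count(signals: JournalSignals) -> int:
    """Tally how many red flags a journal raises; any nonzero
    count warrants a closer look before submitting."""
    return sum(vars(signals).values())
```

A journal that both promises rapid publication and claims to cover all subjects would score 2 here, which is reason enough to check it against Predatory Reports before submitting.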

Seeds of Change

If you plan on attending the Society for Scholarly Publishing’s (SSP) 44th Annual Meeting next month in Chicago, be sure to make time to attend Session 4F, “Open Science and SDGs: Harnessing Open Science to Address Global Issues.” Lucas Toutloff, CTO at Cabells, will be part of an outstanding panel that will be discussing ways the scientific community and journalism can drive change and wider societal outreach through open science policies and by embracing SDGs as a key topic in research impact.

Over the past year we have written extensively about our commitment to doing our part to move the UN Sustainable Development Goals (SDGs) and, ultimately, their 2030 Agenda for Sustainable Development, forward. We were proud to join the SDG Publishers Compact as one of the first U.S. organizations and non-primary publishers globally to be awarded membership, and we look forward to becoming more involved in HESI’s Rankings, Ratings, and Assessments action group, which is tasked with guiding changes to the criteria for assessing the performance of higher education institutions to include contributions to the UN SDGs.

We’ve also been thrilled at the growth of and excitement for the SDG Impact Intensity™ (SDGII) academic journal rating, the first system for evaluating how journals contribute to positively impacting the SDGs. The SDGII is the result of our collaboration with Dr. David Steingard, Director of the SDG Dashboard initiative and Associate Professor of Leadership, Ethics, & Organizational Sustainability at the Haub School of Business at Saint Joseph’s University, and his team of researchers.

The SDGII uses SJU’s AI-based methodology to look at article output in journals from Cabells’ Journalytics database and gives those journals a ranking determined by the relative focus they have exhibited in their article publications over the last five years with respect to the SDGs. The SDGII provides a rating of up to five ‘SDG wheels’ to summarize the SDG relevance of articles published over a five-year period (2016-2020).
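The mapping from article output to a wheel rating might be sketched as follows. This is a minimal illustration under assumed thresholds: the actual SJU/Cabells methodology (its AI classifier and its cut-offs) is not public, so the `sdg_wheels` function and the five thresholds below are placeholders, not the real rating formula.

```python
def sdg_wheels(relevant: int, total: int,
               thresholds=(0.05, 0.15, 0.30, 0.50, 0.70)) -> int:
    """Map the share of SDG-relevant articles a journal published
    over a window (e.g. 2016-2020) to a 0-5 'SDG wheel' rating.

    `relevant` is the number of articles an upstream classifier
    tagged as addressing at least one SDG; the threshold tuple is
    an assumed placeholder, not the published cut-offs.
    """
    if total == 0:
        return 0
    share = relevant / total
    # One wheel per threshold the journal's SDG share clears.
    return sum(share >= t for t in thresholds)
```

Under these assumed thresholds, a journal with 375 SDG-relevant articles out of 500 (a 75% share) would clear all five cut-offs and earn five wheels, while one with a 20% share would earn two.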

Last month, we had the chance to champion the potential benefits and impact of the SDGII at the Principles for Responsible Management Education (PRME) North American Biennial Meeting in Virginia, and at the AACSB’s International Conference and Annual Meeting (ICAM) in New Orleans. David and his team discussed their vision and efforts to inspire a transformation from “quality” to “impact” in academic publications.

From right to left: Dr. Julia Christensen Hughes, Dr. Kathleen Rodenburg, and Dr. David Steingard speak at the PRME 2022 Biennial Meeting at George Mason University in Arlington, VA.

At PRME, we discussed how impact-focused metrics can support progressive publication and business education agendas and unveiled a new iteration of the metric – the SDGII 3000, which provides a rating to measure the SDG-intensity of 3000 academic business journals, as well as the net impact of a business school’s faculty on publications advancing the SDGs. The SDGII 3000 will analyze 95%+ of all relevant business school and SDG-related journals where faculty publish and represents a massive expansion of the measurement of the social and environmental impact of publications through the SDGs.  

Dr. David Steingard presents the SDGII 3000 for the first time at PRME.

We look forward to continuing this discussion in Chicago at the SSP conference, both during our session and beyond. We will discuss the ways that open science is impacting SDG initiatives and programs and explore methods for operationalizing SDG-mindedness as a tool for measuring both research impact and potential. The momentum is building for this game-changing initiative and we hope to see continued interest and excitement from all corners of academia.

SDGs and the Higher Education Sustainability Initiative: The Way Forward


The 17 integrated UN Sustainable Development Goals (SDGs) are a global call to action to end poverty, protect the planet, and ensure that by 2030 all people enjoy peace and prosperity. Research and higher education will play vital roles in society’s march toward achieving the SDGs by the end of the decade and in building a sustainable future by providing current and future stakeholders with the knowledge, skills, and ethos to make informed and effective decisions to this end.

The Higher Education Sustainability Initiative (HESI) is a partnership that gathers over two dozen UN agency members and Higher Education Sustainability Networks. The Initiative tackles the most crucial challenges of our time by redesigning higher education to provide leadership on education for sustainable development, spearheading efforts to ‘green’ campuses, and supporting sustainable efforts in communities, while also ensuring the quality of education, equity, and gender equality.

Initiated in 2012 leading up to the Rio+20 conference, and bolstered with support of the United Nations, HESI provides higher education institutions with a vibrant confluence of higher education, science, and policymaking by enhancing awareness of higher education’s role in supporting sustainable development, facilitating multi-stakeholder discussions and action, and sharing best practices. The Initiative emphasizes the crucial role that higher education plays in educating the current and next generation of leaders, propelling the research agenda for public and private sectors, and helping to shape the path of national economies.

HESI also aims to directly address the problem of aligning research programs and outcomes in scholarly publications. By highlighting those journals that are already focused on this alignment – and those that could do better – Cabells and Saint Joseph’s University are hoping to play a big part in facilitating this process.

One of the overall goals of Cabells is to optimize decision-making for both researchers and institutions. The SDGs are becoming increasingly important to these groups, and we strive to support them in enhancing the impact of the work they’re doing. One way we’ve been able to do this is through our collaboration with Dr. David Steingard and his team at Saint Joseph’s University, developers of the SDG Dashboard, to create a new metric called the SDG Impact Intensity™ (SDGII) journal rating. The SDGII seeks to contextualize and understand the relevance of academic research in terms of the SDGs. Climate change, sustainability, and equity are among the most powerful forces for change in society, and yet they are ignored by traditional citation-based metrics.

The SDG Impact Intensity uses a sophisticated AI methodology from SJU to look at article output in journals from Cabells’ Journalytics database and gives those journals a ranking determined by the relative focus they have exhibited in their article publications over the last five years with respect to the SDGs. The SDGII provides a rating of up to five ‘SDG wheels’ to summarize the SDG relevance of articles published over a five-year period (2016-2020).

As previously discussed in The Source, the SDGII shows that journals well known for perceived academic quality in business and management performed badly when assessed for SDG relevance, while journals focused on sustainability issues performed much better.

We believe our work with SJU and Dr. Steingard will be a key collaboration within the industry and its work on the SDGs, and we’ve joined the SDG Publishers Compact (Cabells was proud to be named the Compact’s member of the month for December 2021) to help further this partnership and the pursuit of the SDGs. In the coming months, Cabells and Dr. Steingard will be on hand at the upcoming PRME, AACSB, and SSP annual meetings to discuss a new iteration of the metric and lead discussions on how impact-focused metrics can support a progressive publication agenda. More than a change in perspective, an ongoing paradigm shift is occurring as the value of journals moves past ideas of quality based largely on citations, reputational lists, and prestige, toward impact and mission-driven research outputs.

One, Two, Three… Blog!

It is a little over three years since Cabells launched its blog The Source, and over 100 articles later it is still here dispensing wisdom on publication ethics, scholarly communications, and even the odd cartoon character. Simon Linacre reviews the good, the bad, and the ugly from the last 1,000 days and counting…


A quick look at the tag cloud at the bottom of this blog tells you everything you need to know about the main topic of conversation that has dominated its content for the last three or so years. While the number of predatory journals appearing and being identified in Cabells’ Predatory Reports shows no sign of abating – 15,715 and counting – it is a topic that always generates the most interest among readers. Part of this fascination, I think, is that for many of us law-abiding citizens, coming face to face with actual crime and misdemeanors happens relatively rarely in our lives. But with every unwanted spam email we receive, we are up close and personal with actual criminality in action.

Posts concerning predatory publishing that have garnered the most interest – and this is replicated in the many webinars that Cabells delivers globally – tend to cover practical advice on avoiding predatory journals, as well as the wackier side of the phenomenon. For example, the 2019 post that featured Yosemite Sam of Yale on a journal’s Editorial Board attracted a lot of attention, as did an article last year answering common questions about predatory journals. Despite the widespread coverage in academic journals and wider media, the topic still holds huge interest for all stakeholders in academia.

Other popular posts have focused on ‘how to…’ guidance, such as the latest criteria used to identify journals for inclusion in the Predatory Reports database and an ‘A to Z’ of predatory publishing in 2020. This perhaps highlights that there is still great uncertainty amongst the many authors, librarians, and publishers who read the blog about how to navigate the predatory journal landscape.

More recently, posts about hijacked journals and wider issues of publication ethics highlighted in scholarly journals have also garnered significant interest, with growing threats such as paper mills worrying many academics. Indeed, reflecting on the 100+ posts shared on the blog, there does seem to be a disproportionately large number of posts on bleak topics such as climate change, threats to academic freedom, and lack of research funding.

However, some positive items have shone through and inspired a good deal of response and hope amidst the gloom. Chief among these is the work being done by Cabells and others to highlight the increasing degree to which research reported in academic journals contributes toward the United Nations’ Sustainable Development Goals (SDGs). In addition to Cabells’ pilot collaboration to create a new metric, one of the most viewed recent posts was on how this ‘new perspective’ could change the entrenched paradigms of research publications for the better. Such interest in new ideas and positive change offers a glimpse of a more open and collaborative future, one that is not mired in scandal and tired thinking. There is much, then, to look forward to in The Source over the next three years and hundred posts.