Guest Post: A look at citation activity of predatory marketing journals

This week we are pleased to feature a guest post from Dr. Salim Moussa, Assistant Professor of Marketing at ISEAH at the University of Gafsa in Tunisia. Dr. Moussa has recently published insightful research on the impact predatory journals have had on the discipline of marketing and, together with Cabells’ Simon Linacre, has some cautionary words for his fellow researchers in that area.

Academic journals are important to marketing scholars for two main reasons: (a) journals are the primary medium through which they transmit and receive scholarly knowledge; and (b) tenure, promotion, and grant decisions depend mostly on the journals in which they have published. Selecting the right journal for a manuscript is thus a crucial decision, and the overabundance of academic marketing journals, together with the increasing "publish or perish" pressure, makes this decision even more difficult.

The "market" of marketing journals is extremely broad, with Cabells' Journalytics indexing 965 publication venues that are associated with "marketing" in their aims and scope. While monitoring this market over the last ten years, I have noticed that a new type of journal has tapped into it: the open access (OA) journal.

The first time I ever heard about OA journals was during a dinner at an international marketing conference held in April 2015 in my country, Tunisia. Many of the colleagues at the dinner table were enthusiastic about having secured publications in a "new" marketing journal published by "IBIMA". Back in my hometown (Gafsa), I took a quick look at IBIMA Publishing's website. The thing that I remember most from that visit is that the website looked odd to me. A few years later, while conducting some research on marketing journals, I noticed some puzzling results for a particular journal. Investigating that journal, I came to realize that a scam journal was brandjacking the identity of the flagship journal of the UK-based Academy of Marketing, the Journal of Marketing Management.

In the course of this research, I discovered terms such as "predatory publishers", "Beall's List", and "Think, Check, Submit" for the first time. This was also the trigger point of a painful yet insightful research experience that lasted an entire year (from May 2019 to May 2020).

Beall's List was no longer available (it was shut down in January 2017), and I had no access to Cabells' Predatory Reports. Freely available lists were either outdated or too specialized (mainly in science, technology, and medicine) to be useful. So, I searched for journals whose titles are identical or confusingly similar to those of well-known, prestigious, non-predatory marketing journals. Using this procedure, I identified 12 journals and then visited each of their websites to collect information about both the publisher and the journal: whether the journal was OA, its article processing charges, whether it had an editor-in-chief, the names of its review board members and their affiliations (if any), the journal's ISSNs, and so on. I even emailed an eminent marketing scholar whose name I was stunned to see on the editorial board of a suspicious journal.
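For readers who want to run a similar screen themselves, here is a minimal sketch of this kind of title-similarity check, assuming Python and its standard difflib module; the journal titles and the 0.8 threshold are illustrative assumptions, not the actual dataset or criteria used in the study.

```python
from difflib import SequenceMatcher

# Illustrative titles only; not the journals examined in the study.
LEGITIMATE_TITLES = [
    "Journal of Marketing Management",
    "Journal of Marketing Research",
    "European Journal of Marketing",
]

def title_similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two journal titles."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def flag_lookalikes(candidate: str, threshold: float = 0.8):
    """Return legitimate titles the candidate is confusingly similar to."""
    return [
        (title, round(title_similarity(candidate, title), 2))
        for title in LEGITIMATE_TITLES
        if title_similarity(candidate, title) >= threshold
    ]

# A hypothetical sound-alike title, used purely for illustration.
print(flag_lookalikes("Journal of Marketing and Management"))
```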

With one journal discarded, I had a list of 11 suspicious journals (Journal A to Journal K).

Having identified the 11 publishers of these 11 journals, I then consulted three freely available and up-to-date lists of predatory publishers: the Dolos List, the Kscien List, and the Stop Predatory Journals List. The aim of consulting these lists was to check whether I was right in qualifying these publishers as predatory. The verdict was unequivocal: each of the 11 publishers was listed in all three of them. These three lists, however, provided no reasons for the inclusion of a particular publisher or journal.

To double-check the list, I used the Directory of Open Access Journals, which is a community-curated online directory that indexes and provides access to high-quality, OA, peer-reviewed journals. None of the 11 journals were indexed in it. To triple-check the list, I used both the 2019 Journal Quality List of the Australian Business Deans Council and the 2018 Academic Journal Guide by the Chartered Association of Business Schools. None of the 11 journals were ranked in these two lists either.

In brief, that year of endeavor resulted in a paper I submitted to the prestigious academic journal Scientometrics, published by Springer Nature; the paper was accepted and published online in late October 2020. In it, I reported the findings of a study that examined the extent of citations received by articles published in ten predatory marketing journals (one of the 11 journals under scrutiny was an "empty journal", with no archives). The results indicated that some of these journals had received a substantial number of citations, with a median of 490 citations per journal and one journal receiving 6,296 citations (see also the Case Study below).

I entitled the article “Citation contagion: A citation analysis of selected predatory marketing journals.” Some people may or may not like the framing in terms of “contagion” and “contamination” (especially in these COVID times), but I wanted the title to be striking enough to attract more readership. Those who read the article may see it as a call for marketing researchers to “not submit their (possibly interesting) knowledge products to any journal before checking that the publication outlet they are submitting to is a non-predatory journal.” Assuming that the number of citations an article receives signals its quality, the findings in my study indicate that some of the articles published in these predatory journals deserved better publication venues. I believe that most of the authors of these articles were well-intentioned and did not know that the journals they were submitting to were predatory. 

A few months earlier, having no access to Cabells' databases, I had read each of the posts on their blog trying to identify marketing journals that were indexed in Predatory Reports. Together with Cabells, our message to the marketing research community is that 10 of the 11 journals I investigated were already listed in (or under review for inclusion in) Predatory Reports. I believe my study has revealed only the tip of the iceberg. Predatory Reports now indexes 140 journals related to the subject of marketing (which represents 1% of the total number of journals listed in Predatory Reports). Before submitting your papers to an OA marketing journal, you can use Predatory Reports to verify that it is legitimate.

Case Study

The study completed by Dr. Moussa provides an excellent primer on how to research and identify predatory journals (writes Simon Linacre). As such, it is instructive to look at one of the journals highlighted in Dr. Moussa’s article in more detail.

Dr. Moussa rightly flagged the British Journal of Marketing Studies as suspicious due to its familiar-sounding title. Adopting a near-miss title is a strategy frequently used by predatory publishers to deceive authors who are not familiar with the original journal; in this case, the British Journal of Marketing Studies sounds similar to a number of established journals in the discipline.

As Dr. Moussa also points out, a questionable journal's website will often fail to stand up to a critical eye. For example, the picture below shows the "offices" of BJMS: a small terraced house in Southern England, which seems an unlikely location for an international publishing house. This journal's website also contains a number of other tells that, while not individually conclusive proof of a predatory publisher, certainly provide indicators: prominent phone numbers, reference to an 'Impact Factor' (not from Clarivate), fake claims of indexation in databases (e.g., DOAJ), no editor contact details, and/or a fake editor identity.

What is really interesting about Dr. Moussa's piece is his investigation of citation activity. We can see from the data below for 'Journal I' (the British Journal of Marketing Studies) that both the total citations and the citations received by its most cited article are significant, and represent what is known as 'citation leakage', where citations flow to and from predatory journals. As articles in these journals are unlikely to have had any peer review, publication ethics checks or proofreading, their content is unreliable and skews citation data for reputable research and journals.

  • Predatory journal: Journal I (BJMS)
  • Total number of citations received: 1,331
  • Number of citations received by the most cited article: 99
  • The most cited article was published in: 2014
  • Number of citations received from SSCI-indexed journals: 3
  • Number of citations received from FT50 listed journals: 0
Predatory Reports entry for BJMS

It is a familiar refrain from The Source, but it bears repeating: as an author you should do due diligence on where you publish your work and 'research your research'. Applying your skills as a researcher to where you publish, and not just to what you want to publish, will save a huge amount of pain in the future, both in avoiding the bad journals and in choosing the good ones.

Cabells and Inera present free webinar: Flagging Predatory Journals to Fight “Citation Contamination”

Cabells and Inera are excited to co-sponsor the free on-demand webinar “Flagging Predatory Journals to Fight ‘Citation Contamination'” now available to stream via SSP OnDemand. Originally designed as a sponsored session for the 2020 SSP Annual Meeting, this webinar is presented by Kathleen Berryman of Cabells and Liz Blake of Inera, with assistance from Bruce Rosenblum and Sylvia Izzo Hunter, also from Inera.

The webinar outlines an innovative collaborative solution to the problem of “citation contamination”—citations to content published in predatory journals with a variety of bad publication practices, such as invented editorial boards and lack of peer review, hiding in plain sight in authors’ bibliographies.

DON’T MISS IT! On Thursday, November 12, 2020, at 11:00 am Eastern / 8:00 am Pacific, Kathleen and Liz will host a live screening with real-time chat followed by a Q&A!

For relevant background reading on this topic, we recommend these Scholarly Kitchen posts:

The A-Z’s of predatory publishing

Earlier this year Cabells (@CabellsPublish) published an A-Z list of issues regarding predatory publishing practices, with one tweet a week going through the entire alphabet. In this week's blog, Simon Linacre republishes all 26 tweets in one place as a primer on how to successfully deal with the phenomenon.

A = American. US probable source of #PredatoryJournals activity as it lends credence to claims of legitimacy a journal may adopt to hoodwink authors into submitting articles #PredatoryAlphabet #PredatoryJournal #PublicationEthics

B = British. Questionable journals often use ‘British’ as a proxy for quality. Over 600 include this descriptor in the Predatory Reports database, many more than in the Journalytics database of recommended journals

C = Conferences. Predatory conferences bear the same hallmarks as #PredatoryJournals – few named academics involved, too-good-to-be-true fees & poorly worded/designed websites. Some tips here.

D = Dual Publication. Publishing in #PredatoryJournals can lead to dual publication if the same article is then published in a legitimate journal

E = Editor (or lack of). Always check if a journal has an Editor, with an institutional email address and corroborated affiliation. Lack of an Editor or their details can indicate poor quality or predatory nature

F = Font. Look at fonts used by journals in emails or on web pages – clashing colors, outsized lettering or mixed up fonts may indicate predatory behavior

G = Germany, which takes #PredatoryJournals seriously through university-level checks, highlighting the issue and exposing problems in a 2018 investigation

H = H-Index. Authors’ and journals’ #H_index scores can be skewed by #PredatoryJournal activity

I = ISSN. Over 4,000 out of 13,000 (30%) journals on @CabellsPublish Predatory Reports database claim an ISSN (some genuine, some fake)

J = JIF. Always check the Journal Impact Factor (JIF) claims on journal websites as many #PredatoryJournals falsely claim them. Check via @clarivate Master Journal List

K = Knowledge. Spotting #PredatoryJournals and recognising legitimate #Journals can be as important as subject knowledge when publishing. 'Research your research' by using tools such as @CabellsPublish databases

L = Legitimacy. First responsibility for authors is to check legitimacy of journals – use checklists, peer recommendations and @CabellsPublish Predatory Reports database

M = Membership scams. Beware any journal that offers membership of a group as a condition of publication. Check existence of group and journal credentials on @CabellsPublish Journal databases

N = Nature. Using a trusted scholarly brand such as @nature can help identify, understand and define #PredatoryJournals, with dozens of articles on the subject via @NatureNews

O = OMICS. In 2019, the FTC fined publisher OMICS over $50m for deceptive publishing practices

P = Predatory Reports. The new name for the @CabellsPublish database of 13,400 #PredatoryJournals

Q = Quick publication. Peer review is a long process, typically lasting months. Beware journals offering quick #PeerReview and publication in days or even weeks

R = Research. Academic #Research should also include research on #Journals to ensure #PublicationEthics and #ResearchIntegrity are upheld. Use @CabellsPublish Predatory Reports to support your research

S = Spam. Journal emails to solicit articles are often predatory (see below for ironic example). Authors can check the legitimacy of journals by using the @CabellsPublish Predatory Reports database

T = Top Tips. Use our research on #PredatoryJournals and how to spot #PredatoryPublishing practices

U = Unknown. Never trust your research to any unknown editor, journal or publisher #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

V = Vocabulary. Look out for 'unusual' vocabulary in predatory journal solicitation emails (e.g., "…your precious paper…") #PredatoryJournals #PredatoryAlphabet via @CabellsPublish

W = Website. Look for red flags on a journal’s website such as dead links, poor grammar/spelling, no address for journal or publisher, or no contact info (just web-form)

X = There are no current ISSNs with more than one 'X' at the end; a single final 'X' serves as a check digit. Be aware of fake ISSNs with multiple X's (see the check-digit sketch after this list)

Y = Yosemite Sam. Always check the backgrounds of Editors and Editorial Board members of journals – if they are a cartoon character, maybe don’t submit your article

Z = Zoology. The US Open Zoology Journal is one of 35 zoology titles on the @CabellsPublish Predatory Reports database. It is not open, is not from the US and has no articles, but will accept your $300 to submit an article
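As a footnote to the 'X' entry above, here is a minimal sketch of how an ISSN check digit can be verified. The weighting rule (8 down to 2, modulo 11, with 'X' standing for 10) is the standard ISSN scheme; the sample values are purely illustrative.

```python
def issn_is_valid(issn: str) -> bool:
    """Check an ISSN's final check digit (weights 8..2, modulo 11, 'X' = 10)."""
    s = issn.replace("-", "").upper()
    if len(s) != 8 or not s[:7].isdigit() or s[7] not in "0123456789X":
        return False
    total = sum(int(d) * w for d, w in zip(s[:7], range(8, 1, -1)))
    check = (11 - total % 11) % 11
    return s[7] == ("X" if check == 10 else str(check))

# The check digit is a single final character, so no genuine ISSN ends in "XX".
print(issn_is_valid("0378-5955"))  # a valid ISSN
print(issn_is_valid("0378-59XX"))  # malformed: two X's
```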

Cabells’ top 7 palpable points about predatory publishing practices

In his latest post, Simon Linacre looks at some new stats collated from the Cabells Predatory Reports database that should help inform and educate researchers, better equipping them to evade the clutches of predatory journals.


In recent weeks Cabells has been delighted to work with both The Economist and Nature Index to highlight some of the major issues for scholarly communication that predatory publishing practices represent. As part of the research for both pieces, a number of facts have been uncovered that not only help us understand the issues inherent in this malpractice much better, but should also point researchers away from some of the sadly typical behaviors we have come to expect.

So, for your perusing pleasure, here are Cabells’ Top 7 Palpable Points about Predatory Publishing Practices:

  1. There are now 13,500 predatory journals listed in the Predatory Reports database, which is currently growing by approximately 2,000 journals a year
  2. Over 4,300 journals claim to publish articles in the medical field (this includes multidisciplinary journals) – that’s a third of the journals in Predatory Reports. By discipline, medical and biological sciences have many more predatory journals than other disciplines
  3. Almost 700 journals in Predatory Reports start with ‘British’ (5.2%), while just 50 do on the Journalytics database (0.4%). Predatory journals often call themselves American, British or European to appear well established and legitimate, when in reality relatively few good quality journals have countries or regions in their titles
  4. There are over 5,300 journals listed in Predatory Reports with an ISSN (40%), although many of these are copied, faked, or simply made up. Having an ISSN is not a guarantee of legitimacy for journals
  5. Around 41% of Predatory Reports journals are based in the US, purport to be from the US, or are suspected of being from the US, based on information on journal websites and Cabells’ investigations. This is the highest count for any country, but only a fraction will really have their base in North America
  6. The average predatory journal publishes about 50 articles a year according to recent research from Bo-Christer Björk of the Hanken School of Economics in Helsinki, less than half the output of a legitimate title. Furthermore, around 60% of papers in such journals receive no future citations, compared with 10% of those in reliable ones
  7. Finally, it is worth noting that while we are in the throes of the Coronavirus pandemic, there are 41 journals listed in Predatory Reports (0.3%) specifically focused on epidemiology and another 35 on virology (0.6% in total). There could be further growth over the next 12 months, so researchers in these areas should be particularly careful now about where they submit their papers.

Reversal of fortune

One of the most common questions Cabells is asked about its Predatory Reports database of journals is whether it has ever "changed its mind" about listing a journal. As Simon Linacre reports, it is less a question of changing the outcome of a decision and more one of a leopard changing its spots.


This week saw the annual release of Journal Impact Factors from Clarivate Analytics, and along with it the rather less august list of journals whose Impact Factors have been suppressed in Web of Science. This year 33 journals were suppressed, all for "anomalous citation patterns found in the 2019 citation data", which pertained to high levels of self-citation. Such a result is the worst nightmare for a publisher: while these patterns can be due to gaming of citation levels, they can also reflect the niche nature of a subject area, or other anomalies about a journal.

Sometimes the decision can be changed, although it is often a year or two before the data can prove a journal has changed its ways. Similarly, Cabells offers a review process for every journal it lists in its Predatory Reports database, and when I arrived at the company in 2018, like many people one of the first things I asked was: has Cabells ever had a successful review to delist a journal?

Open for debate

The answer is yes, but the details of those cases are quite instructive as to why journals are included on the database in the first place, and perhaps more importantly why they are not. Firstly, however, some context. It is three years since the Predatory Reports database was first launched, and in that time almost 13,500 journals have been included. Each journal's report has a link next to the listed violations that allows anyone associated with that journal to view the appeals policy and appeal the decision.


This policy clearly states:

The Cabells Review Board will consider Predatory Journal appeals with a frequency of one appeal request per year, per journal. Publications in Predatory Reports, those with unacceptable practices, are encouraged to amend their procedures to comply with accepted industry standards.

Since 2017, there have been just 20 appeals against decisions to list journals in Predatory Reports (0.15% of all listed journals), and only three have been successful (0.02%). In the first case (Journal A), the journal's peer review processes were checked and it was determined that some peer reviews were being completed, albeit very lightly. In addition, Cabells' investigators found a previous instance of dual publication. However, following the listing, the journal dealt with the problems and retracted the duplicate article, as it appeared the author had submitted two identical articles simultaneously. This in turn led Cabells to revise its evaluations so that this particular violation does not penalize journals for something for which an author was to blame.

In the second review (Journal B), Cabells evaluated the journal's peer review process and found that it, too, was not completing full peer reviews and had a number of other issues: it displayed metrics in a misleading way, lacked editorial policies on its website, and did not have a process for plagiarism screening. After its listing in Predatory Reports, the journal's publisher fixed the misleading elements on its website and demonstrated improvements to its editorial processes. In this second case, the journal's practices had clearly been misleading and deceptive, but it chose to change and improve them.

Finally, a third journal (Journal C) has just had a successful appeal completed. In this case, there were several problems that the journal was able to correct by being more transparent on its website: it added the necessary policies or cleared up confusion about them, and made information about its author fees available. Cabells was also able to evaluate its peer review process after the journal submitted peer review notes on a few articles, and it was evident the editor was managing a good-quality peer review process, hence its removal from the Predatory Reports database. (It should be noted that, as with the other two successful appeals, journals removed from Predatory Reports are not then automatically included in the Cabells Journalytics database.)

Learning curve

Cabells' takeaway from all of these successful reviews was that they were indeed successful: they showed that the original identification was correct, and they prompted improvements that established the journals as better, and certainly non-predatory, publications. They also fed into the continuing improvement Cabells seeks in refining its Predatory Reports criteria, with a further update due to be published later this summer.

There are also things to learn from unsuccessful reviews. In one case a publisher appealed the listing of a number of its journals in Predatory Reports. However, its appeal only highlighted how bad the journals actually were. Indeed, an in-depth review of each journal not only uncovered new violations that were subsequently added to the journals' reports, but also led to the addition of a brand new violation that will be included in the upcoming revision of the Predatory Reports criteria.

The scientific predator has evolved – here’s how you can fight back

Today’s post was written by Simon Linacre, Director of International Marketing and Development at Cabells, and Irfan Syed, Senior Writer and Editor at Editage Insights.


How do you identify a predatory journal? Easy: look in your spam folder, say seasoned researchers.

Actually, this 'initial indicator' is often the key to identifying a predatory journal. Predatory publishers send researchers frequent emails soliciting manuscripts and promising acceptance, messages that, thanks to the email service provider's filters, usually go straight to junk mail. Some cleverly disguised ones do make it to the inbox, though, and sometimes an unwary researcher clicks on one of these emails, unleashing the predator and an all-too-familiar sequence of events: Researcher sends manuscript. Receives quick acceptance, often without any peer review. Signs off copyright. Receives a staggeringly large invoice. Is unable to pay. Asks to withdraw. Receives an equally heavy withdrawal invoice, plus threats. The cycle continues, the publisher getting incrementally coercive, the researcher increasingly frustrated.

What makes a predator

The term "predatory journal" was coined in 2010 by Jeffrey Beall, former Scholarly Initiatives Librarian at the University of Colorado Denver, when he launched his eponymous list (now archived) of fake scientific journals with the aim of educating the scientific community. The term was meant to mirror the guile of carnivores in the wild: targeting the weak, launching a surprise ambush, and effecting a merciless finish.

A more academic definition might be: “Entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices.” In other words, journals that put commerce before science.

Dubious scientific journals have existed since the 1980s. They emerged to offer an easy way around the arduous road to acceptance laid down by top-rung journals. More recently, they have received a boost from the rise of the open access (OA) movement, which seeks to shift the balance of power towards the researcher. With revenues now accruing from the author side, however, new researchers pressured by a 'publish or perish' culture have proved easy targets for predatory publishers that exploit the OA publishing model.

The new face of predation

Today, academia faces another threat, a new predator in scientific communications: predatory author services. The dangers of using them can be just as acute as those of predatory journals. Authors who pay for such services risk misusing any funding they have received by, in effect, channeling it into potentially criminal activity. Such predatory author services may not be equipped to make quality edits to an author's paper: incorrect edits, changes to the author's intended meaning, and unidentified errors may all adversely affect the manuscript. Many authors choose such services to improve their articles and increase their chances of acceptance in high-quality journals, but given the quality of service they receive, they are very likely to be disappointed.

So, predatory author services are just as problematic as predatory journals. Despite the efforts of industry bodies such as COPE, new players seem to be entering the market and even advertising on social media platforms such as Facebook. More worryingly, some of these predatory services display a veneer of sophistication not seen before, including well-designed websites, live online chat features, and direct calling.

Spotting a predatory author service

The good news is that these services bear many of the traits of predatory journals, and can be identified with a little background research. Here are some tips on how to separate predatory author services from professional operations such as Cactus’ Editage:

  • Check the English: For a legitimate journal to have spelling or grammar errors on its site or in published articles would be a heinous crime, and this should go double for an author services provider. So, beyond the slick graphics and smiling model faces, confirm that everything is as it should be with a thorough check of the English
  • Click the links: Dead links, links that loop back to the homepage, or links that don't match the text should further raise your suspicion (see the link-checking sketch after this list)
  • Research the partnerships: If a provider genuinely works with Web of Science, Scopus, and The Lancet, there should be evidence of that rather than mere logos copied and pasted onto the homepage. Search online for these publicized partnerships to confirm they are genuine
  • Look up the provenance: Many predatory operators leave no address at all. Some, though, will include a fake address (which turns out to be a long-abandoned dry-cleaning store on a deserted high street, or a legitimate address that is also home to 1,847 other registered companies). A quick search on Google Maps will show whether the address actually checks out
  • Run if you spot a ghost: The surest giveaway of a predatory author service is the offering of ghostwriting as a service. Ghost authorship, the act of someone else authoring your entire manuscript, is a violation of research integrity. And when even ghostwriting doesn’t suffice, these services are happy to plagiarize another author’s work and pass it off as the client’s own
  • Ask your peers: Before deciding to use a service, double-check any testimonials on the provider’s homepage or ask around in your peer network.
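As a small aid to the 'Click the links' tip above, here is a minimal sketch of a dead-link check using only Python's standard library. The URLs are placeholders, and an automated status check is no substitute for clicking through the provider's site yourself.

```python
from urllib.error import HTTPError, URLError
from urllib.request import Request, urlopen

def check_link(url: str, timeout: float = 10.0) -> str:
    """Return a short status string describing whether a URL responds."""
    req = Request(url, headers={"User-Agent": "link-checker-sketch/0.1"})
    try:
        with urlopen(req, timeout=timeout) as resp:
            return f"OK (HTTP {resp.status})"
    except HTTPError as err:
        return f"DEAD (HTTP {err.code})"
    except URLError as err:
        return f"DEAD ({err.reason})"

# Placeholder URLs; substitute the links found on the provider's website.
for url in ["https://example.com/", "https://example.com/no-such-page"]:
    print(url, "->", check_link(url))
```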

Taking on the predator – collectively and individually

The scientific predator will no doubt continue to evolve, getting more sophisticated with time. Ultimately, all anyone can do to eradicate predatory author services or journals is to increase awareness among authors and provide resources to help them identify such predators. Cabells, Cactus, and many other industry players continually work to provide this guidance, but a good deal of the burden of responsibility has to be shared by academic researchers themselves. As the Romans might have said, caveat scriptor – author beware!

For any author services ad or email you come across, look it up. Search the web, ask your fellow researchers, pose a query in a researcher forum, or consult recommended indices of quality and predatory publications such as those of Cabells. If it is genuine, it will show up in several searches, and you will live to publish another day.

For further help and support in choosing the right journal or author services, go to: www.cabells.com or www.editage.com.

Five dos and don’ts for avoiding predatory journals

HAVE YOUR SAY

Publication ethics is at the core of everything that Cabells does, and it continually supports the scholarly communication bodies that seek to uphold the highest standards of publishing practice. As such, we would like to express our support for Simon Linacre (Cabells' Director of International Marketing and Development) in his candidacy to become a COPE Trustee. COPE plays an essential role in ensuring scholarly publishing maintains the highest standards, and if you are a COPE member it is important you use your vote to support the organization's progress.

Simon has been with Cabells for two years and involved in academic publishing for almost 20. In that time he has gained wide experience of all aspects of journal publishing, and in particular of the open access issues on which this role focuses.

If you would like to vote in the election, please go to the COPE website, log in and cast your vote for your favored candidate.

Thanks, The Cabells Team

It is three years since Cabells first launched its database on predatory journals, and a good deal has happened in that time in the world of scholarly publishing. In his latest post, Simon Linacre reflects on these changes and offers some ‘dos and don’ts’ on the latest version of the database.


In June 2017 – which seems a lifetime ago now for all sorts of reasons – Cabells launched a new database that included details on over 4,000 predatory journals. It was the first time that a resource of that size had been made available to researchers who wanted to check the legitimacy or otherwise of journals they may be considering as a destination for their articles. In the intervening years, it is to be hoped many authors have been alerted to the dangers of publishing their research in such journals and benefited from worthwhile publishing experiences in good journals.

At the time, Cabells chose to name the database the ‘Blacklist’ as the most straightforward description of the intent of the database. As some may have seen, we brought forward the decision to change its name to ‘Predatory Reports’ last week in the first of a number of changes Cabells intends to introduce in 2020 and beyond.


The new name includes the word ‘Reports’ for an important reason. The database has been designed as more than a simple list of predatory, fake or questionable journals – it has also been put together so that researchers can use the information that has been collated on all 13,400 journals to inform their understanding of scholarly communications, and as a result, make better decisions about their research publications and career into the future. In this spirit, here are FIVE DOS AND DON’TS of how to use the Cabells Predatory Reports database:

  1. DO check all violations listed for each journal on Predatory Reports to fully understand what the journal is NOT doing properly, as this can help identify predatory behavior in future
  2. DON’T trust a journal because it has an ISSN on its website – over 40% of journals listed on Predatory Reports include one, with many copied from legitimate journals or simply invented
  3. DO check the publisher’s name in the ‘Advanced Search’ option if a journal is not included on the database, as the same publisher could have created a new journal with the same predatory behaviors
  4. DON’T visit a predatory journal website unnecessarily as they could contain malware – hover the cursor over the website to view the full URL to see if it corresponds to that of the journal being checked out
  5. DO send Cabells updates or information on potential new predatory journals by sending an email to ‘journals@cabells.com’

And as a final ‘DO’, do click the link to our 70+ criteria that we use to identify predatory journals – these will be updated soon to streamline and clarify the process of reviewing journals for inclusion in Predatory Reports, and offer a much more robust checklist than currently exists to help researchers avoid falling into the predatory journal trap.

Announcement regarding brand-wide language changes, effective immediately

Since late last year, Cabells has been working on developing new branding for our products that better embody our ideals of integrity and equality in academic publishing and society as a whole. We set out to ensure that the changes represent a total departure from the symbolism inextricably tied to the idea of blacklists and whitelists. In support of, and in solidarity with, the fight against systemic racism that our country is facing, Cabells is implementing brand-wide language changes, effective immediately. The changes implemented today represent only a fraction of those that we will be launching in 2020, but it is an important start.

Users may experience temporary outages as the changes roll out, but normal operations should resume quickly. Customer access will function identically as before the changes, but look for the term “Journalytics” in place of “whitelist” and “Predatory Reports” in place of “blacklist.”

Please contact Mike Bisaccio at michael.bisaccio@cabells.com with any questions or for additional information.

Cabells thanks the entire community for their support of this effort.

Sincerely,
The Cabells Team
