
Checklists to detect potential predatory biomedical journals: a systematic review.

Samantha Cukier1, Lucas Helal1, Danielle B Rice1, Justina Pupkaite2, Nadera Ahmadzai3, Mitchell Wilson1, Becky Skidmore1, Manoj M Lalu4, David Moher5.   

Abstract

BACKGROUND: The increase in the number of predatory journals puts scholarly communication at risk. In order to guard against publication in predatory journals, authors may use checklists to help detect predatory journals. We believe there are a large number of such checklists yet it is uncertain whether these checklists contain similar content. We conducted a systematic review to identify checklists that help to detect potential predatory journals and examined and compared their content and measurement properties.
METHODS: We searched MEDLINE, Embase, PsycINFO, ERIC, Web of Science, and Library, Information Science & Technology Abstracts (January 2012 to November 2018); university library websites (January 2019); and YouTube (January 2019). We identified sources with original checklists used to detect potential predatory journals published in English, French or Portuguese. Checklists were defined as having instructions in point form, bullet form, tabular format or listed items. We excluded checklists or guidance on recognizing "legitimate" or "trustworthy" journals. To assess risk of bias, we adapted five questions from A Checklist for Checklists tool a priori, as no formal assessment tool exists for the type of review conducted.
RESULTS: Of 1528 records screened, 93 met our inclusion criteria. The majority of included checklists to identify predatory journals were in English (n = 90, 97%), could be completed in fewer than five minutes (n = 68, 73%), included a mean of 11 items (range = 3 to 64), did not weight items (n = 91, 98%), did not include qualitative guidance (n = 78, 84%) or quantitative guidance (n = 91, 98%), were not evidence-based (n = 90, 97%) and covered a mean of four of six thematic categories. Only three met our criteria for being evidence-based, i.e. scored three or more "yes" answers (low risk of bias) on the risk of bias tool.
CONCLUSION: There is a plethora of published checklists that may overwhelm authors looking to efficiently guard against publishing in predatory journals. The continued development of such checklists may be confusing and of limited benefit. The similarity in checklists could lead to the creation of one evidence-based tool serving authors from all disciplines.


Keywords:  Predatory journals; Predatory publishing; Scholarly communication; Systematic review


Year:  2020        PMID: 32375818      PMCID: PMC7203891          DOI: 10.1186/s12916-020-01566-1

Source DB:  PubMed          Journal:  BMC Med        ISSN: 1741-7015            Impact factor:   8.775


Background

The influx of predatory publishing, along with the substantial increase in the number of predatory journals, poses a risk to scholarly communication [1, 2]. Predatory journals often lack an appropriate peer-review process and frequently are not indexed [3], yet authors are required to pay an article processing charge. The lack of quality control, the inability to effectively disseminate research and the lack of transparency compromise the trustworthiness of articles published in these journals. Until recently, no agreed-upon definition of predatory journals existed. However, through a consensus process [4], an international group of researchers, journal editors, funders, policy makers, representatives of academic institutions and patient partners developed a definition of predatory journals and publishers. The group recognized that identifying predatory journals and publishers is nuanced; not all predatory journals meet all ‘predatory criteria’, nor do they meet each criterion to the same degree. Thus, in defining predatory journals and publishers, the group identified four main characteristics that could characterize journals or publishers as predatory: “Predatory journals and publishers are entities that prioritize self-interest at the expense of scholarship and are characterized by false or misleading information, deviation from best editorial/publication practices, lack of transparency, and/or use of aggressive and indiscriminate solicitation practices.” [4]. Lists of suspected predatory journals and publishers are also available, although they use different criteria for inclusion [5]. Various groups have developed checklists to help prospective authors and/or editors identify potential predatory journals; these are distinct from efforts, such as “Think. Check. Submit.”, to identify legitimate journals.
Anecdotally, we have recently noticed a steep rise in the number of checklists developed specifically to identify predatory journals, although to our knowledge this has not been quantified previously. Further, we are unaware of any research on the uptake of these checklists. On the one hand, the development of these checklists – practical tools to help detect potential predatory journals – may lead to a substantial decrease in submissions to these journals. On the other hand, large numbers of checklists with varying content may confuse authors and make it more difficult for them to choose any one checklist, if any at all, as suggested by the choice overload hypothesis [6]. That is, the abundance of conflicting information could result in users not consulting any checklist, and the discrepancies between checklists could undermine the credibility of each one. These efforts to reduce the number of submissions to predatory journals would thus be lost. We therefore performed a systematic review of the peer-reviewed and gray literature that includes checklists to help detect potential predatory journals, in order to identify the number of published checklists and to examine and compare their content and measurement properties.

Methods

We followed standard procedures for systematic reviews and reported results according to Preferred Reporting Items for Systematic reviews and Meta-Analyses (PRISMA) guidelines [7]. The project protocol was publicly posted prior to data extraction on the Open Science Framework (http://osf.io/g57tf).

Data sources and searches

An experienced medical information specialist (BS) developed and tested the search strategy using an iterative process in consultation with the review team. The strategy was peer reviewed by another senior information specialist prior to execution using the Peer Review of Electronic Search Strategies (PRESS) Checklist [8] (see Additional file 1). We searched multiple databases with no language restrictions. Using the OVID platform, we searched Ovid MEDLINE® ALL (including in-process and epub-ahead-of-print records), Embase Classic + Embase, PsycINFO and ERIC. We also searched Web of Science and the Library, Information Science and Technology Abstracts (LISTA) database (Ebsco platform). The LISTA search was performed on November 16, 2018 and the Ovid and Web of Science searches were performed on November 19, 2018. Retrieval was limited to the publication dates 2012 to the present. We used 2012 as a cut-off since data about predatory journals were first collected in 2010, [9] and became part of public discourse in 2012 [10]. The search strategy for the Ovid databases is included in Additional file 2. In order to be extensive in our search for checklists that identify potential predatory journals, we identified and then searched two relevant sources of gray literature, based on our shared experiences in this field of research: university library websites and YouTube. Neither search was restricted by language. We used the Shanghai Academic Rankings of World Universities (http://www.shanghairanking.com/ARWU-Statistics-2018.html) to identify university library websites of the top 10 universities in each of the four world regions (Americas, Europe, Asia / Oceania, Africa). We chose this website because it easily split the world into four regions and we saw this as an equitable way to identify institutions and their libraries. 
As our author group is based in Canada, we wanted to highlight the universities in our region and therefore identified the library websites of Canada’s most research-intensive universities (U15) (search date January 18, 2019) and searched their library websites. We also searched YouTube for videos that contained checklists (search date January 6, 2019). We limited our YouTube search to the top 50 results filtered by “relevance” and used a private browser window. Detailed methods of these searches are available on the Open Science Framework (http://osf.io/g57tf).

Eligibility criteria

Inclusion criteria

Our search for studies was not restricted by language; however, for reasons of feasibility, we included studies and/or original checklists developed or published in English, French or Portuguese (languages spoken by our research team). We defined a checklist as a tool whose purpose is to detect a potential predatory journal and whose instructions are in point form, bullet form, tabular format or listed items. To qualify as an original checklist, the items had to have been identified and/or developed by the study authors, or include a novel combination of items from multiple sources, or be an adaptation of another checklist plus items added by the study authors. We included studies that discussed the development of an original checklist. When a study referenced a checklist but did not describe its development, we searched for the paper that discussed the development of the original checklist and included that paper.

Exclusion criteria

Checklists were not considered original if items were hand-picked from one other source; for example, if authors identified the five most salient points from an already existing checklist. We did not include lists or guidance on recognizing a “legitimate” or “trustworthy” journal. We stipulated this exclusion criterion since our focus was on tools that specifically identify predatory journals, not tools that help to recognize legitimate journals.

Study selection

Following de-duplication of the identified titles, we screened records using the online systematic review software program Distiller Systematic Review (DSR) (Evidence Partners Inc., Ottawa, Canada). For each stage of screening, data extraction and risk of bias assessment, we pilot tested a 10% sample of records among five to six reviewers. Screening was performed in two stages: Stage 1: title and abstract; Stage 2: full-text screening (see Fig. 1). Both stages were completed by two reviewers independently and in duplicate. At both stages, discrepancies were resolved either through consensus or third party adjudication.
Fig. 1

PRISMA flow diagram

Data extraction and risk of bias assessment

For each eligible study, two reviewers independently extracted relevant data into DSR and a third reviewer resolved any conflicts. The extracted data items were as follows:
1. Checklist name
2. Number of items in the checklist
3. Whether the items were weighted
4. Number of thematic categories covered by the checklist (six-item list developed by Cobey et al. [3])
5. Publication details (name of publication, author and date of publication)
6. Approximate time to complete the checklist (reviewers used a timer to emulate the process a user would go through and recorded the time as 0–5 min, 6–10 min or more than 10 min)
7. Language of the checklist
8. Whether the checklist was translated, and into what language(s)
9. Methods used to develop the checklist (details on data collection, if any)
10. Whether there was qualitative guidance (instructions on how to use the checklist and what to do with the results)
11. Whether there was quantitative guidance (instructions on summing the results or quantitatively assessing the results to inform a decision)
The list of extracted data items can be found on the Open Science Framework (https://osf.io/na756/). In assessing checklists identified via YouTube, we extracted only data items that were presented visually; any item or explanation delivered by audio only was not included in our assessment. We took the visual presentation of an item as a sign that it was formally included in the checklist. For example, if presenters only talked about a checklist item but did not show it on a slide or in another format visible to viewers, we did not extract that data. To assess risk of bias, we developed an a priori list of five questions for the purpose of this review, adapted from A Checklist for Checklists tool [11] and principles of internal and external validity [12].
The creation of a novel tool to assess risk of bias was necessary because no appropriate formal assessment tool exists for the type of review we conducted. Our author group reviewed the three areas identified in the Checklist for Checklists tool (Development, Drafting and Validation). Based on extensive experience working with reporting guidelines (DM), which are checklists, we chose a feasible number of items from each of the three categories for our novel tool. We pilot tested the items among our author group to ensure that all categories were captured adequately and that the tool could be used feasibly. We used the results of this assessment to determine whether a checklist was evidence-based. We assigned each of the five criteria (listed below) a judgment of “yes” (i.e. low risk of bias), “no” (i.e. high risk of bias) or “cannot tell” (i.e. unclear risk of bias) (see the coding manual with instructions for assessment to determine risk of bias ratings: https://osf.io/sp4vx/). If a checklist scored three or more “yes” answers on the questions below, giving it an overall low risk of bias, we considered it evidence-based. We made this determination based on the notion that a low risk of bias indicates a low risk of systematic error across results. Two reviewers independently assessed data quality in DSR and discrepancies were resolved through discussion; a third reviewer resolved any remaining conflicts. The five criteria, adapted from the Checklist for Checklists tool [11], were as follows:
1. Did the developers of the checklist represent more than one stakeholder group (e.g. researchers, academic librarians, publishers)?
2. Did the developers report gathering any data for the creation of the checklist (i.e. conduct a study on potential predatory journals, carry out a systematic review, collect anecdotal data)?
3. Does the checklist meet at least one of the following criteria: has a title that reflects its objectives; fits on one page; each item on the checklist is one sentence?
4. Was the checklist pilot-tested or trialed with front-line users (e.g. researchers, students, academic librarians)?
5. Did the authors report how many criteria in the checklist a journal must meet in order to be considered predatory?
In assessing websites, we used a “two-click rule” to locate information: once on the checklist website, if we did not find the information within two mouse clicks, we concluded no information was available.
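As an illustration only (this is not the authors' software), the decision rule described above – three or more "yes" answers across the five criteria classify a checklist as evidence-based – can be sketched as a small function; the function name and labels are hypothetical:

```python
# Illustrative sketch of the review's decision rule: a checklist is rated
# evidence-based when at least 3 of the 5 risk-of-bias criteria are "yes"
# (low risk of bias). Ratings follow the paper: yes / no / cannot tell.
VALID_RATINGS = {"yes", "no", "cannot tell"}

def is_evidence_based(answers, threshold=3):
    """answers: the five per-criterion judgments, e.g. ['yes', 'no', ...]."""
    answers = [a.lower() for a in answers]
    if any(a not in VALID_RATINGS for a in answers):
        raise ValueError("each answer must be 'yes', 'no' or 'cannot tell'")
    # Only explicit "yes" (low risk of bias) counts toward the threshold;
    # "cannot tell" (unclear risk) does not.
    return sum(a == "yes" for a in answers) >= threshold

# Example: a checklist scoring "yes" on all five criteria qualifies,
# while one with no "yes" answers does not.
print(is_evidence_based(["yes"] * 5))                               # True
print(is_evidence_based(["no", "cannot tell", "no", "no", "no"]))   # False
```

The `threshold` parameter simply encodes the "three or more" cut-off stated in the text.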

Data synthesis and analysis

We examined the checklists qualitatively and conducted qualitative comparisons of the items. We compared the items in the included checklists to gauge their agreement on content by item and overall. We summarized the checklists in table format to facilitate inspection and discussion of findings. Frequencies and percentages were used to present characteristics of the checklists. We used the list developed by Shamseer et al. [13] as the reference checklist and compared our results to this list. We chose this as the reference list because of the rigorous empirical data generated by authors to ascertain characteristics of potential predatory journals.
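The descriptive synthesis described above (frequencies, percentages, and the mean and range of item counts) amounts to simple summaries over the extracted records. A minimal sketch with hypothetical data – the records and field layout are invented for illustration, not taken from the review's extraction files:

```python
from statistics import mean

# Hypothetical extraction records: (number of items, items weighted?, language)
checklists = [
    (13, False, "English"),
    (5,  False, "English"),
    (22, True,  "English"),
    (12, False, "French"),
]

# Mean and range of item counts, as reported in the results
item_counts = [n for n, _, _ in checklists]
print(f"mean items: {mean(item_counts):.1f}, "
      f"range: {min(item_counts)} to {max(item_counts)}")   # mean items: 13.0, range: 5 to 22

# Frequency and percentage, the format used throughout the results (n = x, y%)
n_weighted = sum(w for _, w, _ in checklists)
pct = 100 * n_weighted / len(checklists)
print(f"weighted items: n = {n_weighted} ({pct:.0f}%)")      # weighted items: n = 1 (25%)
```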

Results

Deviations from our protocol

We refined our definition of an original checklist to exclude checklists composed of items taken solely from another checklist. Checklists made up of items taken from more than one source were considered original even when the developers did not create the checklist items themselves. For reasons of feasibility, we did not search the reference lists of these checklists to identify further potentially relevant studies. For title and abstract screening, we had anticipated using the liberal accelerated method, in which only one reviewer is required to include a citation for further assessment at full-text screening and two reviewers are needed to exclude a citation [14]. Instead, we used the traditional screening approach: two reviewers screened records independently and in duplicate. We changed our screening methods because the traditional approach became feasible and also reduced the number of full-text articles that had to be ordered. After completing data collection, we recognized that checklists were being published in discipline-specific journals within biomedicine. We wanted to determine which disciplines were represented and in what proportion, so we scanned the journals and assigned disciplines from an evolving list, adding new disciplines as we encountered them. Following the screening of 1529 records, we identified 93 original checklists for inclusion in our study (see full details in Fig. 1).

Checklist characteristics

We identified 53 checklists through our search of electronic databases. The number of checklists identified increased over time: one each in 2012 [10] and 2013 [15], rising to 16 in 2017 [13, 16–30] and 12 in 2018 [31–42]. We identified 30 original checklists [1, 43–71] from university library websites. More checklists were published in recent years (2017: 4 [45–48]; 2018: 7 [49–55]; 2019: 11 [56–66]; five checklists listed no publication date). We identified 10 more checklists from YouTube [72–81]: one uploaded in 2015 [72], six in 2017 [73–78] and three in 2018 [79–81]. See Table 1 for full checklist characteristics.
Table 1

Characteristics of checklists (oldest to most recently published)

Study | Checklist name | Number of items | Items weighted (Y/N) | Time to complete (min) | Methods used to develop checklist (NR = not reported) | Qualitative guidance (Y/N) | Quantitative guidance (Y/N)
Checklists from electronic journal databases (n = 53)
[REFERENCE CHECKLIST] Shamseer, 2017 [13] | Salient characteristics of potential predatory journals | 13 | No | 0–5 min | Cross-sectional analysis: 93 predatory journals, 99 OA, 100 subscription-based journals assessed | No | No
Beall, 2012 [10] | Criteria for Determining Predatory Open-Access Publishers | 5 | No | 10+ min | Criteria based on OASPA^a Code of Conduct, two COPE^b publications | No | No
Beall, 2013 [15] | Some warning signs of questionable publishers | 7 | No | 0–5 min | Observational research on own emails received | No | No
Crawford, 2014 [82] | No title | 11 | No | 6–10 min | Assessed all criteria in Beall’s criteria | Yes | No
Knoll, 2014 [83] | Avoiding Predatory OA Journals | 17 | No | 0–5 min | Works cited | No | No
Lukic, 2014 [84] | No title | 13 | No | 0–5 min | Multiple references | No | No
Beall, 2015 [85] | Criteria for Determining Predatory Open-Access Publishers | 5 | No | 10+ min | Criteria based on COPE documents: Code of Conduct and Principles of Transparency and Best Practices in Scholarly Publication | No | No
Bhad, 2015 [86] | How should one suspect a journal could be a predatory journal? | 9 | No | 0–5 min | NR | No | No
Bradley-Springer, 2015 [87] | No title | 6 | No | 0–5 min | Multiple references | No | No
Hemmat Esfe, 2015 [88] | Features of the Fake Journals | 9 | No | 0–5 min | NR | No | No
INANE Predatory Publishing Practices Collaborative, 2015 [89] | Guidelines for evaluating the integrity of a journal / Red flags | 7 | No | 0–5 min | Limited literature review | No | No
Pamukcu Gunaydin, 2015 [90] | No title | 10 | No | 0–5 min | Authors’ top 10 based on other references | No | No
Proehl, 2015 [91] | Guidelines for evaluating the integrity of a journal - Red flags | 7 | No | 0–5 min | References to other checklists: COPE etc. | No | No
Stone, 2015 [92] | Guidelines for evaluating the integrity of a journal | 6 | No | 0–5 min | Other credible resources: COPE, INANE^c, other | No | No
Yucha, 2015 [93] | Guidelines for Evaluating the Integrity of a Journal - Red flags | 10 | No | 0–5 min | Multiple references to other checklists | No | No
Cariappa, 2016 [94] | Telltale signs - Something is wrong! | 7 | No | 0–5 min | Some literature cited | No | No
Carroll, 2016 [95] | Common Practices of Predatory Open Access Publications | 4 | No | 0–5 min | Limited literature review | No | No
Dadkhah, 2016 [96] | Criteria to rank predatory journals | 14 | Yes | 10+ min | Observational study of 150 journals: 80 predatory, 70 non-predatory | Yes | Yes
Fraser, 2016 [97] | Red Flags for Recognizing Predatory Journals | 6 | No | 0–5 min | Two citations | No | No
Glick, 2016 [98] | What you can expect from a predatory publisher | 7 | No | 0–5 min | Multiple citations | No | No
Glick, 2016a [99] | Clues suggesting a “predatory” journal | 11 | No | 0–5 min | Multiple references | No | No
Hansoti, 2016 [100] | Overall Approach to Choosing the Journal | 11 | No | 0–5 min | Extensive literature review | Yes | No
Morley, 2016 [101] | 10 steps to spot a predatory publisher | 10 | No | 0–5 min | A few citations | Yes | No
Nolan, 2016 [102] | None (a section title exists but no checklist title) | 5 | No | 0–5 min | None noted | No | No
Ward, 2016 [103] | No title | 8 | No | 6–10 min | None listed | No | No
Abadi, 2017 [16] | No title | 26 | No | 6–10 min | NR | No | No
Balehegn, 2017 [30] | No title | 5 | No | 0–5 min | References | No | No
Berger, 2017 [17] | Detailed Characteristics of Predatory Journals | 15 | No | 6–10 min | NR | No | No
Das, 2017 [18] | How to identify predators? | 15 | No | 0–5 min | Two citations | No | No
Erfanmanesh, 2017 [19] | No title | 18 | No | 6–10 min | Multiple references | No | No
Eriksson, 2017 [20] | Characteristics of a predatory journal | 25 | No | 10+ min | Limited literature review | No | No
Janodia, 2017 [21] | No title | 9 | No | 6–10 min | NR | Yes | No
Khan, 2017 [22] | Attributes, characteristics and practices of potential predatory journals | 9 | No | 0–5 min | Citations | No | No
Klyce, 2017 [23] | Common characteristics of predatory journals | 13 | No | 0–5 min | Limited literature review | No | No
Manca, 2017 [24] | No title | 6 | No | 0–5 min | Limited literature review | No | No
Miller, 2017 [25] | Signs of a Predatory Publisher | 8 | No | 0–5 min | NR | No | No
Misra, 2017 [26] | Red flags based on which one may suspect the legitimacy of a journal | 17 | No | 0–5 min | Literature review and authors’ experiences | No | No
Mouton, 2017 [27] | Comparing the characteristics of good practice in scholarly publishing with those of predatory publishing | 7 | No | 0–5 min | In-depth assessment of journals identified by Beall’s list where South African authors published | No | No
Oren, 2017 [28] | Obvious signs of predatory journals | 7 | No | 0–5 min | NR | No | No
Shamseer, 2017 [13] | Salient characteristics of potential predatory journals | 13 | No | 0–5 min | Observational study: 93 predatory journals, 99 OA^d, 100 subscription-based journals assessed | No | No
Stratton, 2017 [29] | Characteristics of Health and Medical Journal Publishing Formats - Open Access Predatory | 4 | No | 0–5 min | Cited references | No | No
Ajuwon, 2018 [33] | Characteristics of Predatory Publishers and Journals | 12 | No | 0–5 min | Citations from other sources | No | No
Bowman, 2018 [34] | Identifying Predatory Journals and Publishers | 29 | No | 6–10 min | NR | No | No
Gerberi, 2018 [35] | Quick List of Predatory Publisher Warning Signs | 7 | No | 0–5 min | Limited literature review | No | No
Kokol, 2018 [37] | No title | 4 | No | 0–5 min | Analysis of papers 2013–2017, predatory (Beall’s) vs non-predatory | No | No
Lewinski, 2018 [38] | Eight tips to identify a predatory journal or publisher | 8 | No | 0–5 min | NR | Yes | No
McCann, 2018 [31] | Guidelines for authors to avoid predatory publishers | 25 | No | 0–5 min | Brief literature review | Yes | No
Memon, 2018 [39] | No title | 14 | No | 6–10 min | Collected emails and web pages of each journal/publisher; used Beall’s list, PubMed, DOAJ^e, Thomson Reuters (now Clarivate Analytics) | No | No
Nnaji, 2018 [40] | No title | 11 | No | 6–10 min | Two references | No | No
O’Donnell, 2018 [32] | Identifying a predator | 17 | No | 6–10 min | Other evidence-based checklist | No | No
Pamukcu Gunaydin, 2018 [36] | How to avoid sending your work to a predatory journal | 5 | No | 0–5 min | Limited literature review | No | No
Power, 2018 [41] | No title | 11 | No | 6–10 min | References COPE, INANE | No | No
Richtig, 2018 [42] | Criteria identified or suggested in the literature that can potentially be used to identify predatory journals | 13 | No | 6–10 min | Literature review | No | No
Wikipedia, 2019 [104] | No title | 10 | No | 0–5 min | Multiple citations | No | No
Checklists from university library websites (n = 30)
[REFERENCE CHECKLIST] Shamseer, 2017 [13] | Salient characteristics of potential predatory journals | 13 | No | 0–5 min | Cross-sectional analysis: 93 predatory journals, 99 OA, 100 subscription-based journals assessed | No | No
Carlson, 2014 [43] | None | 11 | No | 0–5 min | Based on personal experiences looking into questionable OA journals | No | No
Clark, 2015 [1] | None | 5 | No | 0–5 min | NR | No | No
University of Edinburgh, 2015 [44] | Some warning signs to look out for | 4 | No | 0–5 min | NR | No | No
Africa Check, 2017 [45] | None | 7 | No | 0–5 min | NR | No | No
Cabell’s – Clarivate, 2017 [46] | None | 64 | No | 10+ min | Analysis by specialists (see: https://www2.cabells.com/about-blacklist) | No | No
Duke University Medical Center, 2017 [47] | Be iNFORMEd Checklist | 6 | No | 0–5 min | NR | Yes | No
University of Calgary, 2017 [48] | No title | 6 | No | 0–5 min | NR | No | No
Coopérer en information scientifique et technique, 2018 [49] | No title | 35 | No | 10+ min | NR | No | No
Eaton, University of Calgary, 2018 [50] | No title | 12 | No | 0–5 min | Other sources cited | Yes | No
Lapinski, Harvard, 2018 [51] | No title | 3 | No | 0–5 min | NR | Yes | No
Sorbonne Université, 2018 [52] | Comment repérer un éditeur prédateur (“How to spot a predatory publisher”) | 12 | No | 0–5 min | NR | No | No
University of Alberta, 2018 [54] | No title | 19 | No | 6–10 min | NR | No | No
University of British Columbia, 2018 [53] | No title | 16 | No | 6–10 min | NR | No | No
University of Toronto Libraries, 2018 [55] | Identifying Deceptive Publishers: A Checklist | 22 | Yes | 6–10 min | NR | Yes | Yes
Dalhousie University, 2019 [56] | How to recognize predatory journals | 6 | No | 0–5 min | NR | No | No
McGill University, 2019 [57] | No title | 4 | No | 0–5 min | NR | No | No
McMaster University, 2019 [58] | No title | 6 | No | 0–5 min | NR | No | No
Prater, American Journal Experts, 2019 [59] | No title | 8 | No | 0–5 min | NR | Yes | No
Ryerson University Library, 2019 [60] | No title | 5 | No | 0–5 min | NR | No | No
Université Laval, 2019 [61] | No title | 8 | No | 0–5 min | NR | No | No
University of Cambridge, 2019 [62] | No title | 9 | No | 6–10 min | NR | No | No
University of Pretoria, 2019 [63] | No title | 19 | No | 0–5 min | NR | No | No
University of Queensland Library, 2019 [65] | No title | 6 | No | 0–5 min | NR | No | No
University of Queensland Library, 2019a [64] | Red Flags for Open Access Journals | 9 | No | 0–5 min | NR | Yes | No
University of Witwatersrand, 2019 [66] | Predatory Publisher Checklist | 26 | No | 10+ min | NR | No | No
Canadian Association of Research Libraries, ND [67] | How to assess a journal A.K.A. How not to publish in an undesirable journal | 12 | No | 0–5 min | NR | Yes | No
Columbia University Libraries, ND [68] | No title | 5 | No | 0–5 min | NR | Yes | No
Technion Library, ND [69] | No title | 10 | No | 0–5 min | NR | No | No
UC Berkeley, ND [70] | No title | 3 | No | 0–5 min | Cited 1 paper | No | No
University of Ottawa Scholarly Communication, ND [71] | No title | 12 | No | 0–5 min | NR | No | No
Checklists from YouTube (n = 10)
[REFERENCE CHECKLIST] Shamseer, 2017 [13] | Salient characteristics of potential predatory journals | 13 | No | 0–5 min | Cross-sectional analysis: 93 predatory journals, 99 OA, 100 subscription-based journals assessed | No | No
Robbins, Western Sydney University, 2015 [72] | Red Flags | 9 | No | 6–10 min | NR | No | No
Attia, 2017 [73] | Spot Predatory Publishers | 4 | No | 0–5 min | NR | No | No
Kysh, USC Keck School of Medicine, 2017 [74] | Characteristics of Predatory Publishers | 9 | No | 0–5 min | NR | No | No
McKenna, Rhodes University, 2017 [75] | Predatory Publications: Shark Spotting | 7 | No | 0–5 min | NR | No | No
Nicholson, University of Witwatersrand, 2017 [76] | Cautionary Checklist | 36 | No | 10+ min | NR | No | No
Raszewski, 2017 [77] | What to watch out for | 4 | No | 0–5 min | NR | No | No
Seal-Roberts, Springer Healthcare, 2017 [78] | So how do we recognize a predatory publisher? | 10 | No | 0–5 min | NR | No | No
Menon, SCMS Group of Educational Institutions, India, and Berryman, Cabell’s, 2018 [79] | Characteristics of Predatory Journals | 7 | No | 0–5 min | NR | No | No
Office of Scholarly Communication, Cambridge University, 2018 [80] | None | 12 | No | 0–5 min | NR | No | No
Weigand, UNC Libraries, 2018 [81] | No title | 5 | No | 0–5 min | NR | No | No

^a OASPA Open Access Scholarly Publishers Association

^b COPE Committee on Publication Ethics

^c INANE International Academy of Nursing Editors

^d OA Open Access

^e DOAJ Directory of Open Access Journals


Language and translation

Almost all checklists were published in English (n = 90, 97%), and the remaining checklists in French (n = 3, 3%) [49, 52, 61]. Two additional English checklists identified through university library websites were translated into French [55, 67] and one was translated into Hebrew [69].

Approximate time for user to complete checklist, number of items per checklist and weighting

Most checklists could be completed within five minutes (n = 68, 73%); 17 checklists (18%) could be completed in six to 10 min [16, 17, 19, 21, 32, 34, 39–42, 53–55, 62, 72, 82, 103] and eight checklists (9%) took more than 10 min to complete [10, 20, 46, 49, 66, 76, 85, 96]. Checklists contained a mean of 11 items each (range: 3 to 64). Items were weighted in two checklists [55, 96].

Qualitative and quantitative guidance

Qualitative guidance on how to use the results was provided in 15 checklists (16%) [21, 31, 38, 47, 50, 51, 55, 59, 65, 67, 68, 82, 96, 100, 101], and quantitative guidance was provided in two checklists [55, 96], i.e. prescribing a set number of criteria that would identify the journal or publisher as predatory.

Methods used to develop checklists

To develop their checklists, authors reported using analysis by specialists [46], drawing on already existing checklists [85, 91, 93], selecting the most salient features from the existing literature on predatory journals [31, 42, 100], conducting empirical studies [13, 27, 39, 96], or drawing on personal experience [15].

Risk of bias assessment

Among all 93 checklists, three (3%) were assessed as evidence-based [27, 96, 100] (see Table 2 for detailed risk of bias assessment results, including whether a checklist was determined to be evidence-based, i.e. rated as low risk of bias for at least three of the criteria).
Table 2

Risk of bias assessment. Three or more “Yes” assessments result in an overall assessment of evidence-based

Study | Represents more than one stakeholder group (Y/N/U)* | Gathered data for checklist development (Y/N/Only citations/U) | Title reflects objectives (Y/N) | Fits on one page (Y/N) | Each item one sentence (Y/N) | Meets at least one of these three (Y/N; only this column counts toward the total) | Pilot-tested (Y/N/U) | Reports number of criteria needed to be considered predatory (Y/N) | Overall assessment: evidence-based? (Y/N)

* U = unclear (“cannot tell”)
Checklists from electronic journal databases (n = 53)
[REFERENCE CHECKLIST] Shamseer, 2017 [13] | U | Y | Y | Y | Y | Y | U | N | N
Beall, 2012 [10] | U | U | N | N | N | N | U | N | N
Beall, 2013 [15] | U | Y | Y | Y | Y | Y | U | N | N
Crawford, 2014 [82] | U | U | N | Y | Y | Y | U | N | N
Knoll, 2014 [83] | U | Only citations | Y | Y | N | Y | U | N | N
Lukic, 2014 [84] | U | Only citations | N | Y | N | Y | U | N | N
Beall, 2015 [85] | U | Only citations | N | N | N | N | U | N | N
Bhad, 2015 [86] | U | U | Y | Y | Y | Y | U | N | N
Bradley-Springer, 2015 [87] | U | Only citations | N | Y | N | Y | U | N | N
Hemmat Esfe, 2015 [88] | U | Only citations | N | Y | Y | Y | U | N | N
INANE Predatory Publishing Practices Collaborative, 2015 [89] | U | U | Y | Y | Y | Y | U | N | N
Pamukcu Gunaydin, 2015 [90] | U | Only citations | N | Y | N | Y | U | N | N
Proehl, 2015 [91] | U | U | N | Y | Y | Y | U | N | N
Stone, 2015 [92] | U | Only citations | N | Y | Y | Y | U | N | N
Yucha, 2015 [93] | U | U | Y | Y | Y | Y | U | N | N
Cariappa, 2016 [94] | U | Only citations | Y | Y | Y | Y | U | N | N
Carroll, 2016 [95] | U | Y | Y | Y | Y | Y | U | N | N
Dadkhah, 2016 [96] | U | Y | Y | Y | Y | Y | Y | Y | Y
Fraser, 2016 [97] | U | Only citations | Y | Y | Y | Y | U | N | N
Glick, 2016 [98] | U | U | Y | Y | Y | Y | U | N | N
Glick, 2016a [99] | U | Only citations | Y | Y | Y | Y | U | N | N
Hansoti, 2016 [100] | Y | Y | N | Y | N | Y | U | N | Y
Morley, 2016 [101] | U | U | N | N | N | N | U | N | N
Nolan, 2016 [102] | U | U | N | Y | N | Y | U | N | N
Ward, 2016 [103] | U | U | N | N | N | N | U | N | N
Abadi, 2017 [16] | U | U | N | Y | Y | Y | U | N | N
Balehegn, 2017 [30] | U | Only citations | N | Y | Y | Y | U | N | N
Berger, 2017 [17] | U | U | N | Y | N | Y | U | N | N
Das, 2017 [18] | U | Only citations | Y | Y | Y | Y | U | N | N
Erfanmanesh, 2017 [19] | U | Only citations | Y | Y | Y | Y | U | N | N
Eriksson, 2017 [20] | U | U | Y | Y | Y | Y | U | N | N
Janodia, 2017 [21] | U | U | Y | Y | N | Y | U | N | N
Khan, 2017 [22] | Y | Only citations | Y | Y | N | Y | U | N | N
Klyce, 2017 [23] | U | Only citations | Y | Y | Y | Y | U | N | N
Manca, 2017 [24] | U | Only citations | N | Y | N | Y | U | N | N
Miller, 2017 [25] | U | U | Y | Y | Y | Y | U | N | N
Misra, 2017 [26] | U | Y | Y | Y | Y | Y | U | N | N
Mouton, 2017 [27] | U | U | Y | Y | N | Y | Y | Y | Y
Oren, 2017 [28] | U | U | Y | Y | Y | Y | U | N | N
Shamseer, 2017 [13] | U | Y | Y | Y | Y | Y | U | N | N
Stratton, 2017 [29] | U | U | Y | Y | Y | Y | U | N | N
Ajuwon, 2018 [33] | U | Only citations | Y | Y | Y | Y | U | N | N
Bowman, 2018 [34] | U | U | Y | Y | N | Y | U | N | N
Gerberi, 2018 [35] | U | Only citations | Y | Y | N | Y | U | N | N
Pamukcu Gunaydin, 2018 [36] | U | Only citations | Y | Y | N | Y | U | N | N
Kokol, 2018 [37] | U | Y | N | Y | Y | Y | U | N | N
Lewinski, 2018 [38] | U | Y | Y | Y | N | Y | U | N | N
McCann, 2018 [31] | U | Y | Y | Y | Y | Y | U | N | N
Memon, 2018 [39] | U | Y | Y | Y | Y | Y | U | N | N
Nnaji, 2018 [40] | U | Only citations | Y | N | N | Y | U | N | N
O’Donnell, 2018 [32] | U | Only citations | N | N | N | N | U | N | N
Power, 2018 [41] | U | Only citations | N | N | N | N | U | N | N
Richtig, 2018 [42] | U | Only citations | Y | Y | N | Y | U | N | N
Wikipedia, 2019 [104] | U | Only citations | N | Y | Y | Y | U | N | N
Checklists from university library websites (n = 30)
[REFERENCE CHECKLIST] Shamseer, 2017 [13]UYYYYYUNN
Carlson, 2014 [43]UYNNNNUNN
Clark, 2015 [1]UUNYNYUNN
University of Edinburgh, 2015 [44]UUYYYYUNN
Africa Check, 2017 [45]UOnly citationsNYNYUNN
Cabell’s - Clarivate, 2017 [46]YUNNYYUNN
Duke University Medical Center, 2017 [47]UUNNNNUNN
University of Calgary, 2017 [48]UUNYNYUNN
Coopérer en information scientifique et technique, 2018 [49]UUNNYYUNN
Eaton, University of Calgary, 2018 [50]UYNYYYUNN
Lapinski, Harvard University, 2018 [51]UUNNNNUNN
Sorbonne Université, 2018 [52]UUYYYYUNN
University of Alberta, 2018 [54]UUNNNNUNN
University of British Columbia, 2018 [53]UOnly citationsYNNYUNN
University of Toronto Libraries, 2018 [55]YUNNNNUYN
Dalhousie University, 2019 [56]UUYYNYUNN
McGill University, 2019 [57]UUNYNYUNN
McMaster University, 2019 [58]UUNYYYUNN
Prater - American Journal Experts, 2019 [59]YUNNNNUNN
Ryerson University Library, 2019 [60]UUNYNYUNN
Université Laval, 2019 [61]UUNYNYUNN
University of Cambridge, 2019 [62]UUNYNYUNN
University of Pretoria, 2019 [63]UUNYYYUNN
University of Queensland Library, 2019 [65]UUNYNYUNN
University of Queensland Library, 2019a [64]UUNYNYUNN
University of Witwatersrand, 2019 [66]UUYNYYUNN
Canadian Association of Research Libraries, ND [67]UUNYYYUNN
Columbia University Libraries, ND [68]UUNYNYUNN
Technion Library, ND [69]UUNYNYUNN
UC Berkeley, ND [70]UUNYNYUNN
University of Ottawa Scholarly Communication, ND [71]UUNYNYUNN
Checklists from YouTube (n = 10) [URLs available athttps://osf.io/eds9f/]
[REFERENCE CHECKLIST] Shamseer, 2017 [13]UY (cross-sectional study on journals)YYYYUNN
Robbins S, Western Sydney University, 2015 [72]UUYYYYUNN
Attia, 2017 [73]UUYNNYUNN
Kysh, USC Keck School of Medicine, 2017 [74]UUYYYYUNN
McKenna, Rhodes University, 2017 [75]UUYYYYUNN
Nicholson, University of Witwatersrand, 2017 [76]UUYNNYUNN
Raszewski, 2017 [77]UUYYNYUNN
Seal-Roberts, Springer Healthcare, 2017 [78]UUYYYYUNN
Menon, SCMS Group of Educational Institutions, India and Berryman, Cabells, 2018 [79]UUYYYYUNN
Office of Scholarly Communication, Cambridge University, 2018 [80]UUNNYYUNN
Weigand, UNC Libraries, 2018 [81]UUNYYYUNN

Y = Yes, N = No, U = Unclear


Results for risk of bias criteria

Criterion #1: representation of more than one stakeholder group in checklist development. For the majority of checklists (n = 88, 94%), it was unclear whether there was representation of more than one stakeholder group in the checklist development process (unclear risk of bias). The remaining five checklists reported the inclusion of more than one stakeholder group (low risk of bias) [22, 46, 55, 59, 100].

Criterion #2: authors reported gathering data to inform checklist development. In most studies (n = 55, 59%) there was no mention of data gathering for checklist development (unclear risk of bias); in 26 cases (28%), one or two citations were noted next to checklist items, with no other explanation of item development or relevance (high risk of bias) [18, 19, 22–24, 30, 32, 33, 35, 36, 40–42, 45, 53, 83–85, 87, 88, 90, 92, 94, 97, 99, 104]. Twelve records (13%) included a description of authors gathering data to develop a checklist (low risk of bias) [13, 15, 26, 31, 37–39, 43, 50, 95, 96, 100].

Criterion #3: at least one of the following: a title that reflected the checklist objective; the checklist fits on one page; items were one sentence long. Most checklists were assessed as low risk of bias on this criterion, with 81 checklists (87%) meeting at least one of the noted criteria.

Criterion #4: authors reported pilot testing the checklist. In the majority of studies (n = 91, 98%), authors did not report pilot testing during the checklist development stages (unclear risk of bias).

Criterion #5: checklist instructions included a threshold number of criteria to be met in order to be considered predatory. The majority of studies (n = 90, 97%) did not include a threshold number of criteria to be met in order for the journal or publisher to be considered predatory (high risk of bias).
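The overall assessment rule can be sketched in code. This is our illustration of the decision logic, not code from the review; the criterion key names are invented.

```python
# Sketch of the review's rule: a checklist is rated evidence-based when at
# least three of the five risk of bias criteria are answered "Yes".
# (Key names below are illustrative assumptions, not from the paper.)

CRITERIA = [
    "stakeholders",     # criterion 1: >1 stakeholder group represented
    "data_gathering",   # criterion 2: data gathered to inform development
    "presentation",     # criterion 3: relevant title / one page / one-sentence items
    "pilot_tested",     # criterion 4: pilot testing reported
    "threshold_given",  # criterion 5: threshold number of criteria stated
]

def is_evidence_based(ratings: dict) -> bool:
    """Return True when at least three criteria are rated 'Y' (Yes)."""
    return sum(1 for c in CRITERIA if ratings.get(c) == "Y") >= 3

# Dadkhah, 2016 [96], per Table 2: U, Y, Y (meets criterion 3), Y, Y
dadkhah = {"stakeholders": "U", "data_gathering": "Y", "presentation": "Y",
           "pilot_tested": "Y", "threshold_given": "Y"}
print(is_evidence_based(dadkhah))  # True
```

Note that "Only citations" and "U" both count as not-Yes under this rule, which is why a checklist citing a source for each item can still fail the evidence-based threshold.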

Assessment of the thematic content of the included checklists

We found that the checklists covered the six thematic categories identified by Cobey et al. [3] as follows (see Table 3 for the thematic categories and their descriptions):
Table 3

Thematic categories covered by the checklists (oldest to most recently published)

Study | Categories covered by checklist* (X = covered)
Journal operations | Article | Editorial and peer review | Communications | Article processing charge | Dissemination, indexing + archiving
Checklists from electronic journal databases n = 53
 [REFERENCE CHECKLIST] Shamseer, 2017 [13]XXXXXX
 Beall, 2012 [10]XXXXXX
 Beall, 2013 [15]XXXX
 Crawford, 2014 [82]XXX
 Knoll, 2014 [83]XXXXX
 Lukic, 2014 [84]XXXX
 Beall, 2015 [85]XXXXXX
 Bhad, 2015 [86]XXXX
 Bradley-Springer, 2015 [87]XX
 Hemmat Esfe, 2015 [88]XXXX
 INANE Predatory Publishing Practices Collaborative, 2015 [89]XXX
 Pamukcu Gunaydin, 2015 [90]XXXX
 Proehl, 2015 [91]XXX
 Stone, 2015 [92]XXX
 Yucha, 2015 [93]XXX
 Cariappa, 2016 [94]XXXX
 Carroll, 2016 [95]XXX
 Dadkhah, 2016 [96]XXXXX
 Fraser, 2016 [97]XXXX
 Glick, 2016 [98]XXXX
 Glick, 2016a [99]XXXX
 Hansoti, 2016 [100]XXXX
 Morley, 2016 [101]XXXX
 Nolan, 2016 [102]XXXX
 Ward, 2016 [103]XXXXXX
 Abadi, 2017 [16]XXXXXX
 Balehegn, 2017 [30]X
 Berger, 2017 [17]XXXXX
 Das, 2017 [18]XXXX
 Erfanmanesh, 2017 [19]XXXXXX
 Eriksson, 2017 [20]XXXXXX
 Janodia, 2017 [21]XXX
 Khan, 2017 [22]XXXXX
 Klyce, 2017 [23]XXXXX
 Manca, 2017 [24]XXXX
 Miller, 2017 [25]XXX
 Misra, 2017 [26]XXXXXX
 Mouton, 2017 [27]XXXX
 Oren, 2017 [28]XXXX
 Shamseer, 2017 [13]XXXXXX
 Stratton, 2017 [29]XXXX
 Ajuwon, 2018 [33]XXXXX
 Bowman, 2018 [34]XXXXXX
 Gerberi, 2018 [35]XXX
 Kokol, 2018 [37]X
 Lewinski, 2018 [38]XXXX
 McCann, 2018 [31]XXXXX
 Memon, 2018 [39]XXXX
 Nnaji, 2018 [40]XXXXXX
 O’Donnell, 2018 [32]XXXXXX
 Pamukcu Gunaydin, 2018 [36]XXXX
 Power, 2018 [41]XXXXX
 Richtig, 2018 [42]XXXXXX
 Wikipedia, 2019 [104]XXXX
TOTALS /53 checklists from electronic journal databases (n, %): 47 (89) | 24 (45) | 45 (85) | 42 (79) | 34 (64) | 34 (64)
Checklists from university library websites n = 30
 Carlson, 2014 [43]XX
 Clark, 2015 [1]XXXX
 University of Edinburgh, 2015 [44]XX
 Africa Check, 2017 [45]XXX
 Cabell’s – Clarivate, 2017 [46]XXXXXX
 Duke University Medical Center, 2017 [47]XXXXX
 University of Calgary, 2017 [48]XXXX
 Coopérer en information scientifique et technique, 2018 [49]XXXXX
 Eaton, University of Calgary, 2018 [50]XXXX
 Lapinksi, Harvard, 2018 [51]X
 Sorbonne Université, 2018 [52]XXXXX
 University of Alberta, 2018 [54]XXXX
 University of British Columbia, 2018 [53]XXXXXX
 University of Toronto Libraries, 2018 [55]XXXXXX
 Dalhousie University, 2019 [56]XXXX
 McGill University, 2019 [57]XXXX
 McMaster University, 2019 [58]XXXXX
 Prater, 2019 [59]XXXX
 Ryerson University Library, 2019 [60]XXXXX
 Université Laval, 2019 [61]XXXXX
 University of Cambridge, 2019 [62]XXXXXX
 University of Pretoria, 2019 [63]XXXXX
 University of Queensland Library, 2019 [65]XXXXX
 University of Queensland Library, 2019a [64]XXXX
 University of Witwatersrand, 2019 [66]XXXXXX
 Canadian Association of Research Libraries, ND [67]XXXXXX
 Columbia University Libraries, ND [68]XX
 Technion Library, ND [69]XXX
 UC Berkeley, ND [70]XXXX
 University of Ottawa Scholarly Communication, ND [71]XXXX
Totals /30 checklists from university library websites (n, %): 29 (97) | 12 (40) | 23 (77) | 21 (70) | 20 (67) | 24 (80)
Checklists from YouTube n = 10
 Robbins S. Western Sydney University, 2015 [72]XXXX
 Attia, 2017 [73]XXX
 Kysh, USC Keck School of Medicine, 2017 [74]XXXX
 McKenna, Rhodes University, 2017 [75]XXX
 Nicholson, University of Witwatersrand, 2017 [76]XXXXXX
 Raszewski, 2017 [77]XXX
 Seal-Roberts, Springer Healthcare, 2017 [78]XXXX
 Menon, SCMS Group of Educational Institutions, India and Berryman, Cabells, 2018 [79]XXXXX
 Office of Scholarly Communication, Cambridge University, 2018 [80]XXXXX
 Weigand, UNC Libraries, 2018 [81]XXX
Totals /10 checklists from YouTube (n, %): 9 (90) | 3 (30) | 9 (90) | 8 (80) | 7 (70) | 4 (40)
TOTAL (n, %): 85 (91) | 39 (42) | 77 (83) | 71 (76) | 61 (66) | 62 (67)

*Categories as described by Cobey et al. 2018 [3]

Journal operations: 85 checklists (91%) assessed information on the journal’s operations.

Assessment of previously published articles: 40 checklists (43%) included questions on the quality of articles published in the journal in question.

Editorial and peer review process: 77 checklists (83%) included questions on the editorial and peer review process.

Communication: 71 checklists (76%) included an assessment of the manner in which communication is set up between the journal / publisher and the author.

Article processing charges: 61 checklists (66%) included an assessment of information on article processing charges.

Dissemination, indexing and archiving: 62 checklists (67%) included suggested ways in which submitting authors should check for information on the journal’s and publisher’s dissemination, indexing and archiving procedures.

Across all 93 checklists, a mean of four of the six thematic categories was covered, demonstrating that similar themes were covered by all checklists. Twenty percent of checklists (n = 19), including the reference checklist, covered all six categories [10, 13, 16, 19, 20, 26, 32, 34, 40, 42, 46, 53, 55, 62, 66, 67, 76, 85, 103]. Assessment of previously published articles was the category least often included in a checklist (n = 40, 43%), and journal operations was the category most often included (n = 85, 91%).
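The tallies above can be sketched as a small computation. This is an illustration with invented sample checklists, not the real Table 3 rows:

```python
# Six thematic categories from Cobey et al. [3]; the sample data is invented.
CATEGORIES = [
    "journal operations", "article", "editorial and peer review",
    "communication", "article processing charges",
    "dissemination, indexing and archiving",
]

# Hypothetical checklists mapped to the categories they cover.
checklists = {
    "A": {"journal operations", "communication", "article processing charges"},
    "B": set(CATEGORIES),  # covers all six categories
    "C": {"journal operations", "editorial and peer review",
          "communication", "dissemination, indexing and archiving"},
}

# Number of checklists covering each category, and the mean number of
# categories covered per checklist (the review reports a mean of four of six).
per_category = {c: sum(c in covered for covered in checklists.values())
                for c in CATEGORIES}
mean_covered = sum(len(covered) for covered in checklists.values()) / len(checklists)

print(per_category["journal operations"])  # 3
print(round(mean_covered, 1))              # 4.3
```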

Discipline-specific journals

Of the checklists published in academic journals, 10 (22%) were published in nursing journals [25, 31, 35, 38, 41, 87, 89, 91–93], eight (18%) in journals related to general medicine [13, 16, 20, 22, 23, 34, 94, 95], four (9%) in emergency medicine journals [29, 36, 90, 100], four (9%) in information science journals [19, 30, 40, 82] and four (9%) in psychiatry and behavioral science journals [18, 24, 83, 86]. The remaining checklists were published in a variety of other discipline-specific journals within the field of biomedicine, with three or fewer checklists per discipline (e.g. specialty medicine, pediatric medicine, general medicine and surgery, medical education and dentistry).

Discussion

Many authors have developed checklists specifically designed to identify predatory journals; the number of such checklists has increased since 2012, and the majority were published in 2015 or later (n = 81, 87%). Comparing the 93 identified checklists with the reference checklist, we observed that, on average, the content of the checklist items was similar, as were the categories or domains covered. The checklists were also similar in time to complete, number of items (this number varies considerably, but the average is consistent with the reference list) and lack of qualitative and quantitative guidance on completing them. Furthermore, only 3% of checklists (n = 3) were deemed evidence-based, few weighted any items (n = 2, 2%) and few were developed through empirical study (n = 4, 4%). Of note, one of the checklists [33] appeared in a journal that is potentially predatory.

Summary of evidence

In total, we identified 93 checklists to help identify predatory journals and/or publishers. A search of electronic databases resulted in 53 original checklists, a search of library websites of top universities resulted in 30 original checklists and a filtered and limited search of YouTube returned 10 original checklists. Overall, checklists could be completed quickly, covered similar categories of topics and were lacking in guidance that would help a user determine if the journal or publisher was indeed predatory.

Strengths and limitations

We used a rigorous systematic review process to conduct the study. We searched multiple data sources, including the published literature, university library websites worldwide and YouTube. We were limited to the languages of checklists we could assess (English, French and Portuguese). However, the majority of academic literature is published in English [105]; we are therefore confident that we captured the majority of checklists, or at least a representative sample. For reasons of feasibility, we were not able to capture all checklists available. Our reference checklist did not qualify as evidence-based under our predetermined risk of bias criteria, which may be because the list of characteristics in the reference list was not initially intended as a checklist per se. However, the purpose of the reference checklist was to serve as a reference point for readers, regardless of whether it qualified as evidence-based. Creating a usable checklist tool requires attention not only to developing the content of items but also to other details, such as pilot testing and making the items succinct, as identified in our risk of bias criteria. Shamseer et al. perhaps did not attend to these details because their list had a different intended purpose. Our risk of bias tool was created based on other existing tools and developed through the expertise of the authors. Although useful for the purpose of this exercise, the tool remains based on our expert judgment, even though it incorporates elements of scientific principles. We noted that the “Think. Check. Submit.” checklist [106] was referenced in many publications and we believe it is often used as guidance for authors to identify presumed legitimate journals. However, we did not include this checklist in our study because we excluded checklists that help to identify presumed legitimate publications; our specific focus was on checklists that help to detect potential predatory journals.

Conclusion

In our search for checklists to help authors identify potential predatory journals, we found great similarity across checklist media and across the journal disciplines in which the checklists were published. Although many of the checklists were published in field-specific journals and / or addressed a specific audience, the content of the lists did not differ much. This may reflect the idea that checklist developers are all looking to identify the same items. Only a small proportion of the records included the empirical methods used to develop the checklists, and only a few checklists were deemed evidence-based according to our criteria. We noted that checklists with more items did not necessarily take longer to complete; this speaks to the varying complexity of the checklists.

Importantly, very few authors offered concrete guidance on using the checklists or any threshold that would allow authors to identify definitively whether a journal was predatory. The lack of checklists providing threshold values could be due to the fact that a definition of predatory journals did not exist until this year [3, 4]; providing the detailed requirements that would qualify a journal as predatory would therefore have been a challenge. Nevertheless, we identify a threshold value as important for a checklist's usability. Without a recommended or suggested threshold value, checklist users may not feel confident making a decision on whether or not to submit to a journal. We recommend a threshold value as a way for users to actively engage with the checklist and make it a practical tool.

With this large number of checklists in circulation, and the lack of explicit and exacting guidelines to identify predatory publications, are authors at continued risk of publishing in journals that do not follow best publication practices? We see some value in discipline-specific lists for the purpose of more effective dissemination. However, this needs to be balanced against the risk of confusing researchers and overloading them with choice [6]. If most of the domains in the identified checklists are similar across disciplines, would a single list, relevant in all disciplines, result in less confusion, maximize dissemination and enhance implementation? In our study, we found no checklist to be optimal. Currently, we would caution against any further development of checklists and instead offer the following guidance to authors. Look for a checklist that:

1. Provides a threshold value for the criteria used to assess potential predatory journals, e.g. if the journal meets these three checklist items then we recommend avoiding submission;

2. Has been developed using rigorous evidence, i.e. empirical evidence that is described or referenced in the publication.

We note that only one checklist [96] of the 93 we assessed fulfills both of the above criteria. Other factors (length of time to complete, number of categories covered by the checklist, ease of access, ease of use or others) may also influence the usability of a checklist. Using an evidence-based tool with a clear threshold for identifying potential predatory journals may help reduce the burden of research waste occurring as a result of the proliferation of predatory publications.

Additional file 1. Peer Review of Electronic Search Strategies (PRESS) Checklist.
Additional file 2. Search strategy for the Ovid database.
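The threshold guidance above can be illustrated with a minimal sketch. The threshold of three items and the ten-item checklist are our example assumptions, not validated values from the review:

```python
# Hypothetical application of a threshold rule: flag a journal as potentially
# predatory once it matches at least `threshold` checklist items.
# (The default of 3 mirrors the example in the guidance; it is illustrative.)

def meets_predatory_threshold(item_hits: list, threshold: int = 3) -> bool:
    """item_hits: one boolean per checklist item (True = warning sign present)."""
    return sum(item_hits) >= threshold

# A journal showing 4 of 10 warning signs on an invented 10-item checklist:
hits = [True, False, True, True, False, False, True, False, False, False]
if meets_predatory_threshold(hits):
    print("Threshold met: avoid submission.")
else:
    print("Threshold not met: assess the journal further before submitting.")
```

An explicit rule like this is what gives a checklist user a defensible yes/no decision, rather than an open-ended list of warning signs.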

1. Did the developers of the checklist represent more than one stakeholder group (e.g. researchers, academic librarians, publishers)?

2. Did the developers report gathering any data for the creation of the checklist (i.e. conduct a study on potential predatory journals, carry out a systematic review, collect anecdotal data)?

3. Does the checklist meet at least one of the following criteria: 1- Has title that reflects its objectives; 2- Fits on one page; 3- Each item on the checklist is one sentence?

4. Was the checklist pilot-tested or trialed with front-line users (e.g. researchers, students, academic librarians)?

5. Did the authors report how many criteria in the checklist a journal must meet in order to be considered predatory?

