Abstract
BACKGROUND: Associations were examined between author-reported use of reporting guidelines to prepare submissions to JNCI: Journal of the National Cancer Institute (JNCI), editorial decisions, and reviewer ratings for adherence to reporting guidelines and clarity of presentation.
Keywords: Adherence; Clarity; Editorial decisions; Peer review; Presentation; Reporting guidelines; Submissions
Year: 2018 PMID: 30275983 PMCID: PMC6158810 DOI: 10.1186/s41073-018-0052-4
Source DB: PubMed Journal: Res Integr Peer Rev ISSN: 2058-8615
Fig. 1 Numbers of submissions for which authors said they used a reporting guideline or did not. Standard reporting guideline (SRG); STrengthening the Reporting of OBservational studies in Epidemiology (STROBE) [11]; Animal Research: Reporting of In Vivo Experiments (ARRIVE) [6]; Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) [13]; Consolidated Standards of Reporting Trials (CONSORT) [15]; REporting recommendations for tumour MARKer prognostic studies (REMARK) [7]; Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [10]; STAndards for Reporting of Diagnostic accuracy studies (STARD) [8]; Meta-analysis Of Observational Studies in Epidemiology (MOOSE) [9]; Biospecimen Reporting for Improved Study Quality (BRISQ) [14]; STrengthening the REporting of Genetic Association studies (STREGA) [12], an extension to STROBE; and Consolidated Health Economic Evaluation Reporting Standards (CHEERS) [16]
Submissions by editorial decision and by the reporting guidelines authors said they used
| Editorial decision | All Submissions | STROBE | ARRIVE | MIQE | CONSORT | REMARK | PRISMA | STARD | MOOSE | BRISQ | STREGA | CHEERS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| All, no. (%) | 2209 (100) | 531 (24.0) | 150 (6.8) | 54 (2.4) | 96 (4.3) | 140 (6.3) | 133 (6.0) | 65 (2.9) | 26 (1.2) | 34 (1.5) | 23 (1.0) | 4 (0.2) |
| Sent to peer review | 396 (17.9) | 86 (3.9) | 28 (1.3) | 5 (0.2) | 23 (1.0) | 24 (1.1) | 25 (1.1) | 10 (0.5) | 7 (0.3) | 4 (0.2) | 7 (0.3) | 0 (0.0) |
| Rejected after peer review | 255 (11.5) | 36 (1.6) | 14 (0.6) | 3 (0.1) | 7 (0.3) | 12 (0.5) | 16 (0.7) | 5 (0.2) | 1 (0.0) | 3 (0.1) | 4 (0.2) | 0 (0.0) |
| Not rejected after peer review | 141 (6.4) | 50 (2.3) | 14 (0.6) | 2 (0.1) | 16 (0.7) | 12 (0.5) | 9 (0.4) | 5 (0.2) | 6 (0.3) | 1 (0.0) | 3 (0.1) | 0 (0.0) |
| Not sent to peer review | 1813 (82.1) | 445 (20.1) | 122 (5.5) | 49 (2.2) | 73 (3.3) | 116 (5.3) | 108 (4.9) | 55 (2.5) | 19 (0.9) | 30 (1.4) | 16 (0.7) | 4 (0.2) |
| Reject without review | 98 (4.4) | 22 (1.0) | 1 (0.0) | 1 (0.0) | 2 (0.1) | 2 (0.1) | 8 (0.4) | 2 (0.1) | 1 (0.0) | 2 (0.1) | 0 (0.0) | 2 (0.1) |
| Priority reject | 1715 (77.6) | 423 (19.1) | 121 (5.5) | 48 (2.2) | 71 (3.2) | 114 (5.2) | 100 (4.5) | 53 (2.4) | 18 (0.8) | 28 (1.3) | 16 (0.7) | 2 (0.1) |
STROBE STrengthening the Reporting of OBservational studies in Epidemiology, ARRIVE Animal Research: Reporting of In Vivo Experiments, MIQE Minimum Information for Publication of Quantitative Real-Time PCR Experiments, CONSORT Consolidated Standards of Reporting Trials, REMARK REporting recommendations for tumour MARKer prognostic studies, PRISMA Preferred Reporting Items for Systematic Reviews and Meta-Analyses, STARD STAndards for Reporting of Diagnostic accuracy studies, MOOSE Meta-analysis Of Observational Studies in Epidemiology, BRISQ Biospecimen Reporting for Improved Study Quality, STREGA STrengthening the REporting of Genetic Association studies (an extension to STROBE), CHEERS Consolidated Health Economic Evaluation Reporting Standards
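The percentages in the table above are each cell's share of the 2209 total submissions, rounded to one decimal place (which is why some columns do not sum exactly to 100). A quick check on a few counts from the "All" row:

```python
# Recompute row percentages as count / total submissions, to one decimal.
# Counts are taken from the "All" row of the table above.
total = 2209
counts = {"STROBE": 531, "ARRIVE": 150, "MIQE": 54}

pcts = {name: round(100 * n / total, 1) for name, n in counts.items()}
print(pcts)  # {'STROBE': 24.0, 'ARRIVE': 6.8, 'MIQE': 2.4}
```

These reproduce the tabulated values of 24.0, 6.8, and 2.4 percent.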
Use of reporting guidelines scores across editorial decisions
| Editorial decision | Mean score^a (SD) | P value |
|---|---|---|
| Rejected without peer review | 0.53 (0.50) | |
| Rejected after peer review | 0.53 (0.50) | .68 |
| Not rejected after peer review | 0.49 (0.50) | .47 |
^a Submissions were scored as follows: 1 if the authors indicated they used a reporting guideline to prepare their submission; 0 if they indicated they did not. P values were calculated using a two-sided paired t test
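The scoring rule in the footnote can be sketched as follows. This is a minimal illustration with a hypothetical batch of eight submissions, not the study's data:

```python
from statistics import mean, stdev

# Each submission scores 1 if the authors said they used a reporting
# guideline, 0 if they did not (hypothetical responses).
used_guideline = [1, 1, 0, 1, 0, 0, 1, 0]

group_mean = mean(used_guideline)  # share of submissions reporting SRG use
group_sd = stdev(used_guideline)   # sample standard deviation of the 0/1 scores
print(group_mean)  # 0.5
```

Group means computed this way (here 0.50) are what the table compares across editorial decisions.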
Reviewer rating scores across editorial decisions^a

| Editorial decision | Adherence to reporting guidelines: mean score (SD) | P value | Clarity of presentation: mean score (SD) | P value |
|---|---|---|---|---|
| Rejected after peer review | 2.9 (1.57) | | 3.1 (1.08) | |
| Not rejected after peer review | 3.2 (1.61) | .005 | 3.6 (1.00) | < .001 |
^a Reviewer ratings were scored: not applicable, 0; fair, 1; poor, 2; good, 3; very good, 4; and outstanding, 5. P values were calculated using a two-sided paired t test
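The rating scale in note (a) maps each reviewer label to a number, and the tables report the mean of those numbers. A small sketch, with the label-to-score mapping taken from the footnote and a hypothetical set of ratings:

```python
from statistics import mean

# Label-to-score mapping from footnote (a).
SCALE = {"not applicable": 0, "fair": 1, "poor": 2,
         "good": 3, "very good": 4, "outstanding": 5}

# Hypothetical ratings for one group of submissions.
ratings = ["good", "very good", "good", "outstanding"]
avg = mean(SCALE[r] for r in ratings)
print(avg)  # 3.75
```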
Reviewer rating scores according to author’s claim of using reporting guidelines
| Reviewer question | Author said they did not use a reporting guideline: mean score (SD) | Author said they used a reporting guideline: mean score (SD) | P value |
|---|---|---|---|
| Adherence to reporting guidelines | 2.9 (1.70) | 3.1 (1.48) | .01 |
| Clarity of presentation | 3.3 (1.10) | 3.3 (1.04) | .64 |
Authors reported using the following reporting guidelines: STROBE STrengthening the Reporting of OBservational studies in Epidemiology, ARRIVE Animal Research: Reporting of In Vivo Experiments, MIQE Minimum Information for Publication of Quantitative Real-Time PCR Experiments, CONSORT Consolidated Standards of Reporting Trials, REMARK REporting recommendations for tumour MARKer prognostic studies, PRISMA Preferred Reporting Items for Systematic Reviews and Meta-Analyses, STARD STAndards for Reporting of Diagnostic accuracy studies, MOOSE Meta-analysis Of Observational Studies in Epidemiology, BRISQ Biospecimen Reporting for Improved Study Quality, STREGA STrengthening the REporting of Genetic Association studies (an extension to STROBE), and CHEERS Consolidated Health Economic Evaluation Reporting Standards. Some percentages do not add to 100 owing to rounding. Numerical values were given to each answer (SRG use, 1; no SRG use, 0) or reviewer rating (not applicable, 0; fair, 1; poor, 2; good, 3; very good, 4; and outstanding, 5), and mean scores are presented. P values were calculated using a two-sided paired t test