Vivian M Nguyen, Neal R Haddaway, Lee F G Gutowsky, Alexander D M Wilson, Austin J Gallagher, Michael R Donaldson, Neil Hammerschlag, Steven J Cooke.
Abstract
Delays in peer-reviewed publication may have consequences both for the assessment of scientific prowess in academia and for the communication of important information to the knowledge-receptor community. We present an analysis of the perspectives of authors publishing in conservation biology journals regarding the importance of speed in peer review and how to improve review times. Authors were invited to take part in an online questionnaire, the data from which were subjected to both qualitative (open coding, categorizing) and quantitative (generalized linear models) analyses. We received 637 responses to a total of 6,547 e-mail invitations sent. Peer-review speed was generally perceived as slow, with authors experiencing a typical turnaround time of 14 weeks, while their perceived optimal review time was six weeks. Male and younger respondents appeared to have higher expectations of review speed than female and older respondents. The majority of participants attributed lengthy review times to 'stress' on the peer-review system (i.e., reviewer and editor fatigue), while editor persistence and journal prestige were believed to speed up the review process. Negative consequences of lengthy review times appear to be greater for early-career researchers and can also affect author morale (e.g., motivation or frustration). Competition among colleagues was also of concern to respondents. Incentivizing peer review was among the top suggested changes to the system, along with training graduate students in peer review, increased editorial persistence, and changes to the norms of peer review, such as opening the review process to the public. It is clear that the authors surveyed in this study view the peer-review system as under stress, and we encourage scientists and publishers to push the envelope with new peer-review models.
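The quantitative analysis above used generalized linear models to relate respondents' expectations of review speed to demographic predictors such as gender and age. A minimal sketch of that kind of model fit, using entirely synthetic (hypothetical) data and a Gaussian GLM with identity link, which reduces to ordinary least squares; the variable names and effect sizes here are illustrative assumptions, not the study's actual estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical predictors: gender (0 = female, 1 = male), age in years.
gender = rng.integers(0, 2, n)
age = rng.uniform(25, 65, n)

# Hypothetical response consistent with the reported pattern: male and
# younger respondents expect shorter "optimal" review times (in weeks).
optimal_weeks = 6.0 - 1.0 * gender + 0.05 * (age - 45) + rng.normal(0, 1.0, n)

# Design matrix with intercept; a Gaussian GLM with identity link is
# equivalent to an ordinary least-squares fit.
X = np.column_stack([np.ones(n), gender, age - 45])
beta, *_ = np.linalg.lstsq(X, optimal_weeks, rcond=None)
print(np.round(beta, 2))  # recovered intercept, gender, and age coefficients
```

With enough responses, the fitted coefficients recover the simulated effects: a negative gender coefficient (males expecting faster reviews) and a positive age coefficient (older respondents tolerating longer ones).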
Year: 2015 PMID: 26267491 PMCID: PMC4533968 DOI: 10.1371/journal.pone.0132557
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Reported opinions and experiences of peer-review durations.
| Category | Average time in weeks (mean ± SD) | 25th percentile | Median | 75th percentile | Range (weeks) |
|---|---|---|---|---|---|
| Shortest or quickest review time reported | 5.1 ± 6.0 | 3 | 4 | 6 | 1–88 |
| Opinion of fast review time | 4.4 ± 2.9 | 3 | 4 | 4 | 1–26 |
| Longest or slowest review time reported | 31.5 ± 23.8 | 16 | 24 | 40 | 1–200 |
| Opinion of slow review time | 14.4 ± 8.2 | 8 | 12 | 16 | 1–100 |
| Typical turnaround time reported | 14.4 ± 6.0 | 7 | 10 | 12 | 1–54 |
| Opinion of optimal review time | 6.4 ± 4 | 4 | 6 | 8 | 1–52 |
*SD = Standard Deviation
Fig 1. Male (solid line, triangular points) and female (dashed line, circular points) researcher opinion on the optimal number of weeks (± 95% CI) for the review process given their experience (# of weeks for a “typical” review).
Fig 2. Male (solid line, triangular points) and female (dashed line, circular points) researcher opinion on the time (# of weeks ± 95% CI) considered a short review turnaround given their experience (# of weeks for a short review) and age category.
Fig 3. Male (solid line, triangular points) and female (dashed line, circular points) researcher opinion on the time (# of weeks ± 95% CI) considered a long review given their experience (# of weeks for a long review).
Frequency (%) of respondents’ perspectives on factors influencing the duration of the review process, with the modal score on a Likert-type scale ranging from 1 (greatly slows review speed) to 5 (greatly speeds up review).
| Factor influencing review duration | Greatly slows review speed | Somewhat slows review speed | No impact | Somewhat speeds up review | Greatly speeds up review | Mode |
|---|---|---|---|---|---|---|
| Scientific significance for advancing the field of study (N = 461) | 1% | 10% | 46% | 34% | 9% | 3 |
| Conservation implications of results (N = 208) | 1% | 5% | 74% | 17% | 3% | 3 |
| Policy implications of results (N = 456) | 2% | 10% | 72% | 14% | 3% | 3 |
| Potential public interest or potential for media attention (N = 458) | 1% | 4% | 53% | 33% | 10% | 3 |
| Length of paper (N = 462) | 12% | 55% | 29% | 3% | 1% | 2 |
| Journal prestige or impact factor (N = 459) | 4% | 12% | 27% | 42% | 16% | 4 |
| Maximum 'allocated' review times for each journal (N = 454) | 10% | 21% | 25% | 34% | 10% | 4 |
| Persistence of editorial team (N = 460) | 3% | 10% | 18% | 44% | 25% | 4 |
| Number of reviewers (N = 464) | 22% | 58% | 13% | 7% | 2% | 2 |
| Editor fatigue (lack of time, etc.) (N = 465) | 51% | 42% | 5% | 1% | 1% | 1 |
| Reviewer fatigue (lack of time, etc.) (N = 467) | 71% | 26% | 1% | 1% | 1% | 1 |
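The Mode column in the table above is simply the Likert category with the highest response frequency in each row. A minimal sketch reproducing it, using two rows of the table as input (dictionary names are my own labels):

```python
# Likert categories: 1 = greatly slows review speed ... 5 = greatly speeds up.
# Percentages taken from the "Reviewer fatigue" and "Journal prestige" rows.
reviewer_fatigue = {1: 71, 2: 26, 3: 1, 4: 1, 5: 1}
journal_prestige = {1: 4, 2: 12, 3: 27, 4: 42, 5: 16}

def likert_mode(freqs):
    """Return the Likert category with the highest response frequency."""
    return max(freqs, key=freqs.get)

print(likert_mode(reviewer_fatigue))  # 1, matching the table's Mode column
print(likert_mode(journal_prestige))  # 4
```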
Summary of suggestions to improve the peer review process and increase speed.
| Theme | Description | Approximate proportion of responses |
|---|---|---|
| Deadlines and defined policies | Shorter allocated time to review a manuscript and strict procedures to ensure adherence to deadlines | 30% |
| Referee reward system | Providing incentives and compensation for reviewers and editors | 25% |
| Editorial persistence | Proactivity from editors in sending reminders, following up on deadlines, and setting the tone | 14% |
| Alternative responses | Permitting submission to more than one journal, including early-career researchers as reviewers, following the model of journals that do it well, maintaining a bank or database of reviewers, using sub-reviewers (e.g., expertise in statistics, methods, taxa, tools, etc.) | 13% |
| Change norms of publishing | Author empowerment, journal standardization, open peer review, double-blind review | 12% |
| Improved journal management | Overall management of editorial staff and inter-journal management | 6% |