SeungHye Han1, Tolani F Olonisakin1, John P Pribis2, Jill Zupetic1, Joo Heung Yoon1, Kyle M Holleran3, Kwonho Jeong3, Nader Shaikh4,5, Doris M Rubio3,5,6, Janet S Lee1,5,7.
Abstract
Irreproducibility of preclinical biomedical research has gained recent attention. It has been suggested that requiring authors to complete a checklist at the time of manuscript submission would improve the quality and transparency of scientific reporting, and ultimately enhance reproducibility. Whether a checklist enhances quality and transparency in reporting preclinical animal studies, however, has not been empirically studied. Here we searched two highly cited life science journals, one that requires a checklist at submission (Nature) and one that does not (Cell), to identify in vivo animal studies. After screening 943 articles, a total of 80 articles were identified in 2013 (pre-checklist) and 2015 (post-checklist) and included for detailed evaluation of reported methodological and analytical information. We compared the quality of reporting of preclinical animal studies between the two journals, accounting for differences between journals and changes over time in reporting. We find that reporting of randomization, blinding, and sample-size estimation significantly improved when comparing Nature to Cell from 2013 to 2015, likely due to implementation of a checklist. Specifically, improvement in reporting of these three methodological items was at least three times greater when a mandatory checklist was implemented than when it was not. Reporting of the sex of animals and of the number of independent experiments performed also improved from 2013 to 2015, likely from factors not related to a checklist. Our study demonstrates that completing a checklist at manuscript submission is associated with improved reporting of key methodological information in preclinical animal studies.
Year: 2017 PMID: 28902887 PMCID: PMC5597130 DOI: 10.1371/journal.pone.0183591
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Evaluation of journal standards and requirements as of July 2015.

| Journal standard or requirement | Cell | Nature |
|---|---|---|
| Is the journal a member of endorsing associations, journals, and societies? | Yes | Yes |
| Does the Information for Authors contain a section on the journal's policies for statistical analysis and guidelines for rigorous reporting of study design? | Yes | Yes |
| Requirement for statistics to be fully reported in the paper? | Recommended | Required |
| Statement of the statistical test used for each relevant figure presented? | Recommended | Required |
| Statement of the exact value of N for each relevant figure presented? | Recommended | Required |
| Statements defining measures of center, dispersion, and precision (e.g., mean, median, SD, SEM, confidence intervals)? | Required | Required |
| Investigators are required to report how often each experiment was performed? | Recommended | Required |
| Investigators are required to report sufficient information about sample collection to distinguish between independent biological data points and technical replicates? | Recommended | Required |
| Statements whether the samples were randomized, specifying the method of randomization, at a minimum for all animal experiments? | Recommended | Required |
| Statements whether experimenters were blind to group assignment and outcome assessment, at a minimum for all animal experiments? | Recommended | Required |
| Authors are required to state whether an appropriate sample size was computed when the study was being designed, including the statistical method of computation; if no power analysis was used, how the sample size was determined? | Recommended | Required |
| Authors are required to clearly state the criteria used for exclusion of any data or subjects, if any were excluded? | Recommended | Required |
| Reporting of source, species, strain, sex, age, husbandry, and inbred and strain characteristics of transgenic animals required? | Recommended | Required |
| Ethical governing committee approval of animal use required? | Required | Required |
| Use of community-based standards (such as nomenclature standards and reporting standards like ARRIVE) encouraged, where applicable? | Yes | Yes |
| Checklist form made public to authors and reviewers during submission? | No | Yes |
Fig 1. Outline of the study.
(A) Selection of articles: Twenty consecutive articles meeting the inclusion criteria, among those published beginning in January of 2013 and of 2015, were selected from Nature (which implemented a pre-submission checklist) and Cell (which did not). These represent periods before and after the implementation of the checklist in May 2013. (B) Flow of the analysis: To examine whether the quality of reporting improved over time, the degree of key information reported in 2015 was compared to that in 2013 in both journals combined (Objective 1). To assess whether a checklist is associated with improved quality of reporting, we first compared the changes over time observed in Nature (④ vs. ③). If there was a significant difference, we compared 2015 vs. 2013 in Cell (② vs. ①) and Nature vs. Cell within 2013 (③ vs. ①) and within 2015 (④ vs. ②) to adjust for differences between journals and changes over time in reporting (Objective 2).
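The comparisons in (B) reduce, for each reporting item, to Fisher exact tests on 2×2 tables of reported vs. not-reported article counts. A minimal stdlib sketch of such a test; the helper name `fisher_exact_p` and the example counts are illustrative placeholders, not values from the study:

```python
from math import comb

def fisher_exact_p(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]],
    summing hypergeometric probabilities of every table with the same
    margins that is no more likely than the observed one."""
    row1, row2 = a + b, c + d
    col1, n = a + c, a + b + c + d
    total = comb(n, col1)

    def prob(x):
        # P(top-left cell == x) under the hypergeometric model
        return comb(row1, x) * comb(row2, col1 - x) / total

    p_obs = prob(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs + 1e-12)

# Hypothetical Nature counts: rows = (reported, not reported), cols = (2013, 2015)
p_nature = fisher_exact_p(4, 14, 16, 6)
print(f"Nature 2015 vs. 2013 (hypothetical counts): p = {p_nature:.4f}")
```

Under the study's scheme, the same test would then be repeated within Cell and between journals within each year whenever the within-Nature comparison is significant.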
Data abstraction form to assess the quality and transparency in reporting.
| Item assessed | Rating: 2 | Rating: 1 | Rating: 0 |
|---|---|---|---|
| Source | Yes, all reported | Yes, some reported | No, not reported |
| Species | Yes, all reported | Yes, some reported | No, not reported |
| Strain | Yes, all reported | Yes, some reported | No, not reported |
| Sex | Yes, all reported | Yes, some reported | No, not reported |
| Age | Yes, all reported | Yes, some reported | No, not reported |
| Inbred and strain characteristics of genetically modified animals detailed? (N/A if no genetically modified animals were used) | Yes, all reported | Yes, some reported | No, not reported |
| Identification of committee approving the animal experiments? | Yes | | No |
| Define statistical test used for each relevant figure presented? | Yes, all reported | Yes, some reported | No, not reported |
| Define test as one-sided or two-sided? | Yes, all reported | Yes, some reported | No, not reported |
| Exact value of N for each relevant figure presented? | Yes, all reported | Yes, some reported | No, not reported |
| Definition of center, dispersion and precision measures in figures (e.g., mean, median, SD, SEM, confidence intervals)? | Yes, all reported | Yes, some reported | No, not reported |
| How often each experiment was performed stated? | Yes, all reported | Yes, some reported | No, not reported |
| Sufficient information reported to distinguish independent biological data points from technical replicates? | Yes, all reported | Yes, some reported | No, not reported |
| Samples randomized, with the method of randomization specified? | Yes, reported and performed | Yes, reported but not performed | No, not reported |
| Experimenters blinded to group assignment and outcome assessment? | Yes, reported and performed | Yes, reported but not performed | No, not reported |
| Sample-size estimation performed when the study was designed? | Yes, reported and performed | Yes, reported but not performed | No, not reported |
Fig 2. Distribution of reporting of study designs across time.
The distributions of reporting status are presented as stacked bar graphs. The numbers inside the stacks are the numbers of articles corresponding to each percentage. The data for 2013 and 2015 are the total numbers of articles assessed from Cell and Nature within a given year. Fisher exact tests were performed to assess the difference in reporting of each methodological item across time. Significant P values (< 0.05) are provided.
Fig 3. Changes in rigorous reporting of study designs associated with a checklist.
The numbers inside the pie charts are the numbers of articles corresponding to each category. P values < 0.10 from Fisher exact tests are provided, comparing 2015 vs. 2013 within the intervention (Nature) or comparison (Cell) group, or comparing the intervention vs. comparison group within 2013 and 2015, respectively. ≠ marks comparisons between the two adjacent groups that are significantly different at P < 0.05.