Tom E Hardwicke, Joshua D Wallach, Mallory C Kidwell, Theiss Bendixen, Sophia Crüwell, John P A Ioannidis.
Abstract
Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility. Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts. In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017. Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]). Some articles explicitly disclosed funding sources (or lack of; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]). Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half of the articles were publicly available (101/250, 40% [34% to 47%]). Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.
Keywords: meta-research; open science; reproducibility; social sciences; transparency
Year: 2020 PMID: 32257301 PMCID: PMC7062098 DOI: 10.1098/rsos.190806
Source DB: PubMed Journal: R Soc Open Sci ISSN: 2054-5703 Impact factor: 2.963
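The prevalence estimates in the abstract are reported as proportions with 95% confidence intervals. As a minimal sketch, such intervals can be computed with the Wilson score method, a common choice for binomial proportions; this is an assumption for illustration, as the record does not state which interval method the authors used.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a binomial proportion.

    Illustrative method choice only; the paper's own interval method
    may differ, so bounds can deviate slightly from those reported.
    """
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    margin = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return centre - margin, centre + margin

# Materials availability from the abstract: 16 of 151 articles
lo, hi = wilson_ci(16, 151)
print(f"16/151 = {16/151:.0%}, 95% CI [{lo:.0%} to {hi:.0%}]")
```

For 16/151 this yields roughly 11% with bounds near 7% and 16-17%, close to the [7% to 16%] reported in the abstract; small discrepancies are expected if the authors used a different method (e.g. Clopper-Pearson).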
Measured variables. The variables measured for an individual article depended on the study design classification. Additionally, for articles that were not available (the full text could not be retrieved), no other variables were measured. The exact operational definitions and procedures for data extraction/coding are available in the structured form here: https://osf.io/v4f59/.
| indicator | measured variables | applicable study designs |
|---|---|---|
| article | accessibility and retrieval method (can the article be accessed, is there a public version or is paywall access required?) | all |
| protocols | availability statement (is availability, or lack of, explicitly declared?) | empirical studies^a, commentaries and meta-analyses |
|  | content (what aspects of the study are included in the protocol?) |  |
| materials | availability statement (is availability, or lack of, explicitly declared?) | empirical studies^a |
|  | retrieval method (e.g. upon request or via online repository) |  |
|  | accessibility (can the materials be accessed?) |  |
| data | availability statement (is availability, or lack of, explicitly declared?) | empirical studies^a, commentaries and meta-analyses |
|  | retrieval method (e.g. upon request or via online repository) |  |
|  | accessibility (can the data be accessed?) |  |
|  | content (has all relevant data been shared?) |  |
|  | documentation (are the data understandable?) |  |
| analysis scripts | availability statement (is availability, or lack of, explicitly declared?) | empirical studies^a, commentaries and meta-analyses |
|  | retrieval method (e.g. upon request or via online repository) |  |
|  | accessibility (can the scripts be accessed?) |  |
| pre-registration | availability statement (is availability, or lack of, explicitly declared?) | empirical studies^a, commentaries and meta-analyses |
|  | retrieval method (which registry was used?) |  |
|  | accessibility (can the pre-registration be accessed?) |  |
|  | content (what was pre-registered?) |  |
| funding | disclosure statement (are funding sources, or lack of, explicitly declared?) | all |
| conflicts of interest | disclosure statement (are conflicts of interest, or lack of, explicitly declared?) | all |
| replication | statement (does the article claim to report a replication?) | all |
|  | citation history (has the article been cited by a study that claims to be a replication?) | empirical studies^a |
| evidence synthesis | meta-analysis citation history (has the article been cited by, and included in the evidence-synthesis component of, a meta-analysis?) | empirical studies^a |
|  | systematic review citation history (has the article been cited by, and included in the evidence-synthesis component of, a systematic review?) | empirical studies^a |
| article characteristics | subject area, year of publication, study design, country of origin (based on corresponding author's affiliation), human/animal subjects, 2016 journal impact factor (according to Thomson Reuters Journal Citation Reports) | all |
^a ‘Empirical studies’ encompasses the following study design classifications: field studies, laboratory studies, surveys, case studies, multiple types, clinical trials and ‘other’ designs.
Sample characteristics for the 236 accessible articles.
| characteristic | articles (n = 236) |
|---|---|
| **subject area** |  |
| education | 34 |
| geography, planning and development | 18 |
| sociology and political science | 16 |
| cultural studies | 14 |
| economics and econometrics | 14 |
| business and international management | 12 |
| finance | 10 |
| 44 other social sciences subject areas^a (accounting for less than 10 per area) | 118 |
| **year of publication** |  |
| 2014 | 63 |
| 2015 | 84 |
| 2016 | 69 |
| 2017 | 20 |
| **study design** |  |
| no empirical data | 80 |
| field study | 73 |
| survey | 39 |
| multiple types | 22 |
| case study | 9 |
| laboratory study | 7 |
| commentary with analysis | 3 |
| meta-analysis | 2 |
| other | 1 |
| **country of origin** |  |
| USA | 76 |
| UK | 25 |
| Australia | 17 |
| Germany | 13 |
| 47 other countries^b (accounting for less than 10 per country) | 105 |
| **human/animal subjects** |  |
| human | 105 |
| animal | 0 |
| neither human nor animal subjects involved | 131 |
^a For all subject areas, see https://osf.io/7fm96/.
^b For all countries, see https://osf.io/kg7j5/.
Figure 1. Assessment of transparency and reproducibility-related research practices in the social sciences. Numbers inside bars indicate raw counts. ‘N’ refers to the total number of articles assessed for a given indicator (which was contingent on study design; see table 1).