Kathleen Hall Jamieson, Marcia McNutt, Veronique Kiermer, Richard Sever.
Abstract
Trust in science increases when scientists and the outlets certifying their work honor science's norms. Scientists often fail to signal to other scientists and, perhaps more importantly, the public that these norms are being upheld. They could do so as they generate, certify, and react to each other's findings: for example, by promoting the use and value of evidence, transparent reporting, self-correction, replication, a culture of critique, and controls for bias. A number of approaches for authors and journals would lead to more effective signals of trustworthiness at the article level. These include article badging, checklists, a more extensive withdrawal ontology, identity verification, better forward linking, and greater transparency.
Keywords: scientific integrity; signaling trustworthiness; transparency
Year: 2019 PMID: 31548409 PMCID: PMC6765233 DOI: 10.1073/pnas.1913039116
Source DB: PubMed Journal: Proc Natl Acad Sci U S A ISSN: 0027-8424 Impact factor: 11.205
Fig. 1. A national probability sample of 1,253 US adults was surveyed by telephone for the Annenberg Public Policy Center of the University of Pennsylvania by Social Science Research Solutions from January 30 to February 7, 2019. The margin of error is ±3.42% at the 95% confidence level. A total of 854 respondents (68% of the sample) completed the survey on a cellular phone, while 399 respondents (32%) used a landline. There were 39 respondents (3%) who completed the survey in Spanish. The response rate, calculated using the American Association for Public Opinion Research's Response Rate 3 formula, was 7%. See Dataset S1 for additional details.
Signaling the trustworthiness of studies
| Dimension | Norms | Example of violation | Signaling trust at the study level | Role of stakeholders |
| --- | --- | --- | --- | --- |
| Competence | Bias-minimizing and power-optimizing study design | Insufficient power, selective sampling, absence of bias-controlling measures (e.g., blinding, randomization) | Signal that study meets standards of reporting transparency | Researchers increase trust by reporting on ways they meet norms and sharing details of methods, code, and data—by providing links with PIDs. |
| | Reliance on statistics | p-hacking (29) | Signal that statistical review has been conducted | |
| | Use of reliable reagents | Use of invalidated biological reagents | Report on reagent validation | Journals and publishing platforms develop and enforce reporting standards and make clear what has been verified in review. As such they become trusted vehicles. |
| | Distinction between exploratory and confirmatory studies | HARKing (30), use of inferential statistics in exploratory studies, outcome switching | Preregistration of hypothesis-testing studies; clarity about post hoc analyses | |
| | Conclusions supported by data | Hyped results, data not available | Modest reporting, analysis transparency, data available (with persistent identifier) | Research institutions provide education, environment, and infrastructure to support best practices (e.g., testing reagents, managing data, incentivizing rigor in tenure and promotion). |
| Integrity | Transparency of competing interests | Hidden interests with potential to influence study outcome | Disclosure of competing interests | Journals ensure independence of peer reviewers, use ORCID to attribute activities to individuals, and use available technology for checks. |
| | Validation by peer review | Failure to open data, methods, materials to scrutiny | Open data and materials policies | |
| | | Subversion of peer review (e.g., reviewer rings) | Publish/verify identity of reviewers | |
| | Ethical treatment of research subjects and animals | Inadequate or nonexistent informed consent | Report on IRB approval and permits obtained | Research institutions provide infrastructure such as IRB with relevant expertise. |
| | | Failure to obtain research permits | | Research institutions and funders make research ethics a condition of support, provide education, and facilitate access to permit-granting organizations. |
| Benevolence | Disinterestedness | Financial, personal, or political interests leading to false or selective reporting | Expressions of concern and retractions issued when misconduct is demonstrated or results do not support conclusions | Authors increase trust by being transparent about potential competing interests. |
| | | | | Research institutions investigate misconduct fairly and rapidly, report the outcomes, and protect whistleblowers. |
| | | | | Journals collaborate with institutions to investigate allegations of data falsification and act to protect the record. |
| | | | | Overarching structures (e.g., National Academies) establish norms and arbitration mechanisms (e.g., ombuds). |
The table presents a variety of mechanisms that can help communicate the level of trust warranted by an individual study. Although none is fail-safe in isolation, collectively signaling their presence should help certify the trustworthiness of a study, researcher, or field of inquiry. HARKing (hypothesizing after the results are known) presents a post hoc hypothesis as if it were an a priori one (30); p-hacking is a form of data dredging that occurs “when researchers try out several statistical analyses and/or data eligibility specifications and then selectively report those that produce significant results” (29); IRB, institutional review board; PIDs, persistent identifiers.
Fig. 2. Examples of badges offered by the Center for Open Science that can be adopted by journals. Badges recognize those studies that meet standards for open data, open materials, and preregistration (https://cos.io/).