Malika Ihle, Isabel S Winney, Anna Krystalli, Michael Croucher.
Abstract
Science is meant to be the systematic and objective study of the world but evidence suggests that scientific practices are sometimes falling short of this expectation. In this invited idea, we argue that any failure to conduct research according to a documented plan (lack of reliability) and/or any failure to ensure that reconducting the same project would provide the same finding (lack of reproducibility), will result in a low probability of independent studies reaching the same outcome (lack of replicability). After outlining the challenges facing behavioral ecology and science more broadly and incorporating advice from international organizations such as the Center for Open Science (COS), we present clear guidelines and tutorials on what we think open practices represent for behavioral ecologists. In addition, we indicate some of the currently most appropriate and freely available tools for adopting these practices. Finally, we suggest that all journals in our field, such as Behavioral Ecology, give additional weight to transparent studies and therefore provide greater incentives to align our scientific practices to our scientific values. Overall, we argue that producing demonstrably credible science is now fully achievable for the benefit of each researcher individually and for our community as a whole.Entities:
Keywords: Acknowledging Open Practices; TOP guidelines; badges; integrity; open science initiative; software; toolkit
Year: 2017 PMID: 29622916 PMCID: PMC5873838 DOI: 10.1093/beheco/arx003
Source DB: PubMed Journal: Behav Ecol ISSN: 1045-2249 Impact factor: 2.671
Recommended research process for the main study types in behavioral ecology, with the “minimum” open practices advisable for earning credibility
The columns from Conception through Other research outputs mark successive stages of the project timeline.

| Study type | Conception | Collection | Publication | Other research outputs | Cited for | Follow-up studies |
|---|---|---|---|---|---|---|
| Experiment | Preregister | Maintain reproducible workflow | Separate confirmatory from exploratory analyses; state and follow the 21-word solution^a | Open raw data; open script for data processing; open script for analyses | Objective assessment of a hypothesis | Meta-analysis to quantify the generality of the finding |
| Observational, short or middle term | Write project proposal following TOP guidelines and preregistration checklists | Maintain reproducible workflow | State exploratory nature of the analysis; briefly report entire exploration; state and follow the 21-word solution^a | Open raw data; open script for data processing; open script for analyses | Novelty; discovery; hypothesis source | Preregistered replication to test the hypothesis generated |
| Observational, long term | Write project proposal following TOP guidelines and preregistration checklists | Maintain reproducible workflow | State exploratory nature of the analysis; briefly report entire exploration; state and follow the 21-word solution^a | Single study: open selected raw data, open script for data processing after selection, open script for analyses. Full database: open metadata (prompting preregistrations and subsequent data requests) | Large sample size; wild population: relevant ecological and evolutionary context | Large-scale collaborations for high-impact research |
All these aspects of project management can be carried out and centralized in the Open Science Framework. We highlight some reasons why each study type is or would be valuable and commonly cited (either as a result of, or without, open practices). Further, we emphasize follow-up studies that are facilitated or improved by open practices and that promote the impact of the initial research. The open practices are symbolized by the "Badges to Acknowledge Open Practices" developed by the Center for Open Science, acknowledging Preregistration, Open Materials, and Open Data, respectively (https://osf.io/tvyxz/wiki/home/). We call on journal editors to display them on publications. Preregistration is advisable for all types of studies, but alternatives are presented for observational studies where preregistration might be either premature (i.e., when the study is exploratory) or more difficult (e.g., when the data have already been collected and screened by the data analyst). Greyed-out badges represent these alternatives and/or cases where an open practice still lacks incentives in Behavioral Ecology.
^a The 21-word solution: "We report how we determined our sample size, all data exclusions (if any), all manipulations, and all measures in the study." (Simmons et al., 2012).