
Poor methodological detail precludes experimental repeatability and hampers synthesis in ecology.

Neal R Haddaway; Jos T A Verhoeven

Abstract

Despite the scientific method's central tenets of reproducibility (the ability to obtain similar results when repeated) and repeatability (the ability to replicate an experiment based on the methods described), published ecological research continues to fail to provide sufficient methodological detail to allow either repeatability or verification. Recent systematic reviews highlight the problem, with one example demonstrating that an average of 13% of studies per year (±8.0 [SD]) failed to report sample sizes. The problem affects the ability to verify the accuracy of any analysis, to repeat the methods used, and to assimilate study findings into powerful and useful meta-analyses. It is common across the ecological topics examined to date, and despite previous calls for improved reporting and metadata archiving, which could indirectly alleviate the problem, there is no indication that reporting standards are improving over time. Here, we call on authors, editors, and peer reviewers to make repeatability a top priority when evaluating research manuscripts, bearing in mind that legacy and integration into the evidence base can drastically improve the impact of individual research reports.


Keywords:  Evidence synthesis; experimental design; meta‐analysis; reliability; research legacy; susceptibility to bias; systematic review; transparency

Year:  2015        PMID: 26664691      PMCID: PMC4667817          DOI: 10.1002/ece3.1722

Source DB:  PubMed          Journal:  Ecol Evol        ISSN: 2045-7758            Impact factor:   2.912


The Problem

A central tenet in scientific research is that theories should be testable and refutable (Popper 1968) and that experiments testing these theories should be repeatable (Gurevitch et al. 2001; Koricheva 2003). Research repeatability through transparent description of study design and methodology is paramount to ensuring the reliability of study findings. Relatedly, reproducibility refers to the ability to obtain a similar finding when repeating a method (Slezák and Waczulíková 2011). Despite this universally accepted logic, we have found that poor methodological detail in published ecological research is common and threatens its robustness, impact, and legacy. Critical appraisal is a key requirement of systematic reviews: robust approaches to reviewing existing research evidence using strict methods set out by review coordinating bodies such as the Cochrane Collaboration (www.cochrane.org) or the Collaboration for Environmental Evidence (www.environmentalevidence.org). The opportunities for reaching new overall conclusions on pressing fundamental and applied research questions have grown considerably with the availability of new statistical approaches for meta-analysis. However, through our experience of critically appraising large bodies of evidence, we commonly find published academic research articles that do not provide sufficient methodological detail for studies to be repeated. For example, a systematic map of the environmental and socioeconomic impacts of high-altitude land abandonment identified 111 of 190 studies as being described with low methodological detail (Haddaway et al. 2014a). More specifically, 38 studies failed to report the timing of investigation, 40 failed to report intervention duration, 28 failed to describe the degree of replication, and 105 did not describe the spatial scale over which experiments took place.
Similarly, a systematic review of the impacts of land management on greenhouse gas and carbon flux in boreo-temperate lowland peatlands found 39 of 140 studies to have poor methodological detail: for example, not stating the timescale of management activities or the period or timing of sampling, giving no indication of the number of replicates used, or failing to describe the relative locations of control and treatment areas (Haddaway et al. 2014b). Two similar systematic reviews of the quantitative removal of total nitrogen and phosphorus from water flowing through constructed wetlands in the British Isles (Palmer-Felgate et al. 2013) and globally (Land et al. 2013) also encountered poor methodological detail: in one of them, 67 of 121 studies provided insufficient methodological detail and/or statistical design to allow for meaningful synthesis (Land et al. 2013). In a final example, an ongoing systematic review of the impacts of farmland management on soil organic carbon (Söderström et al. 2014) found that 70 of 500 studies failed to state their experimental design (e.g., split plot, randomized block). These problems are not restricted to systematic reviews: similar gaps occur in meta-analyses (e.g., Garssen et al. 2014), which are a widely used synthesis tool in ecology (Stewart 2010). Previous authors have raised similar concerns over the need to make primary data available (Whitlock 2011) and to accompany such data with clear metadata (Michener et al. 1997). One recent systematic review of the impacts of agricultural land management on soil organic carbon found missing data to remain a significant problem over the last 22 years, with, for example, an average of 13% of studies per year (±8.0 [SD]) failing to report sample size (Fig. 1). Many journals have responded positively and now require data to be archived alongside primary research articles.
These calls relate to analytical reproducibility (the ability to reach the same conclusions) and not experimental repeatability (the ability to repeat the experiment described). Without explicit details of experimental design, the science behind the study cannot be repeated and study results cannot be synthesized.
Figure 1

Missing information in study methods (expressed as the percentage of total studies published in each year) precluding replication across five domains for studies included in a systematic review of the impacts of agricultural management on soil organic carbon (unpublished data based on data from an ongoing systematic review according to Söderström et al. 2014, currently in review). Also displayed are the total number of studies published in each year (secondary y‐axis).

Improving the availability of primary data is not an adequate solution for the problem of inadequate methodological detail. While missing quantitative data (such as means and standard deviations) prevent reliable studies from being included in statistical meta-analysis (Haddaway 2015), missing methodological data prevent a judgement concerning reliability of the study from being made. Methods are available that allow studies missing quantitative data to be included to varying degrees in meta-analyses, typically referred to as imputation (e.g., Furukawa et al. 2006). Even the studies that cannot be included through imputation "count" in syntheses because their existence is noted and their results can be discussed in a narrative. Studies that lack critical methodological detail, however, should not be included because their results may be unreliable. The relative risk of this unreliability depends upon the type of information missing, and "gap filling" methods can, in part, help reviewers (see Potential Solutions, below).
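Imputation of this kind can be sketched in a few lines. The following is a simplified illustration in the spirit of the approaches discussed by Furukawa et al. (2006), not a reimplementation of their method: a missing standard deviation is estimated by borrowing the typical coefficient of variation from fully reported studies. All function and field names are illustrative.

```python
# Illustrative sketch: impute a missing standard deviation (SD) for
# meta-analysis by borrowing the mean coefficient of variation
# (CV = SD / mean) from studies that report complete summary data.
# Simplified stand-in for published imputation methods, not a
# reimplementation of any specific one.

def impute_missing_sd(studies):
    """Fill in missing SDs using the mean CV of fully reported studies.

    `studies` is a list of dicts with keys 'mean' and 'sd' ('sd' may
    be None when unreported). Returns a new list with every 'sd' set.
    """
    cvs = [s["sd"] / s["mean"] for s in studies
           if s["sd"] is not None and s["mean"]]
    if not cvs:
        raise ValueError("no complete studies to borrow a CV from")
    mean_cv = sum(cvs) / len(cvs)

    filled = []
    for s in studies:
        if s["sd"] is None:
            s = {**s, "sd": mean_cv * s["mean"], "imputed": True}
        filled.append(s)
    return filled

studies = [
    {"mean": 10.0, "sd": 2.0},   # complete: CV = 0.20
    {"mean": 20.0, "sd": 6.0},   # complete: CV = 0.30
    {"mean": 8.0,  "sd": None},  # SD not reported
]
filled = impute_missing_sd(studies)
# Borrowed CV ~ 0.25, so the imputed SD for the third study is ~2.0.
```

Studies imputed this way would normally be flagged (as with the `imputed` key above) so that a sensitivity analysis can test whether their inclusion changes the pooled result.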

Potential Solutions

Several solutions exist where primary research is missing methodological information:

1. contact corresponding authors with requests for information;
2. check whether related manuscripts have been published for the same experiment, and extract methodological details where methods can be reliably assumed to be the same;
3. perform sensitivity analysis in meta-analysis to examine the influence of studies missing vital methodological information (but that also provide sufficient quantitative data);
4. once found, publish missing information in a dedicated database (e.g., SRDR [http://srdr.ahrq.gov]) or on a postpublication platform such as PubPeer (https://pubpeer.com), making it easier for future readers to find;
5. promote improved reporting standards in the long term: promote current journal guidelines (e.g., Hillebrand and Gurevitch 2013) and establish universal mandates for methodological detail; improve instructions to peer reviewers to ensure they screen manuscripts for methodological repeatability; and increase awareness of the importance of repeatability, particularly with respect to secondary synthesis and its benefits to legacy and impact.

The first three options are broadly suitable and require minimal effort. However, response rates for email addresses older than 3 to 5 years may be expected to be particularly low given the movement of researchers between institutions. Options 4 and 5 may require considerable, collective effort by the scientific community. Similar concerns regarding missing methodological information have been raised in other disciplines (Altman 2015) and in relation to missing quantitative data that preclude further synthesis (Lajeunesse and Forbes 2003; Hillebrand and Gurevitch 2013; Ellington et al. 2015). We echo these calls by encouraging the research community to ensure that research is described in a way that is truly repeatable.
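The sensitivity analysis mentioned among these options can be illustrated with a minimal sketch: pool effect sizes with fixed-effect inverse-variance weighting, repeat the pooling with the low-detail studies excluded, and compare the two estimates. The study data and function name below are hypothetical.

```python
# Illustrative sensitivity analysis: compare a fixed-effect,
# inverse-variance pooled effect size from all studies against the
# pooled estimate after excluding studies flagged as having low
# methodological detail. Study values are hypothetical.

def pooled_effect(studies):
    """Fixed-effect inverse-variance weighted mean effect size."""
    weights = [1.0 / s["variance"] for s in studies]
    effects = [s["effect"] for s in studies]
    return sum(w * e for w, e in zip(weights, effects)) / sum(weights)

studies = [
    {"effect": 0.50, "variance": 0.04, "low_detail": False},
    {"effect": 0.40, "variance": 0.02, "low_detail": False},
    {"effect": 1.20, "variance": 0.10, "low_detail": True},  # poorly described
]

all_studies = pooled_effect(studies)
well_reported = pooled_effect([s for s in studies if not s["low_detail"]])

# A large shift between the two estimates suggests the poorly described
# studies are influential and their inclusion should be treated cautiously.
shift = abs(all_studies - well_reported)
```

In practice a random-effects model and formal influence diagnostics would usually be preferred; the fixed-effect version is used here only to keep the comparison transparent.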
Based on our experiences of critical appraisal in systematic reviews, we recommend the following minimum requirements be observed for manuscripts documenting experimental and quasi-experimental studies:

- experimental setting (field studies: detailed study location [latitude and longitude] and influential climatic conditions; laboratory studies: controlled conditions [temperature, light cycle, influential reagents]);
- study date(s) and duration;
- selection procedures for sample selection and treatment allocation (purposeful, randomization, blocking, etc.);
- level of true replication;
- level of subsampling (number and repeat or within-replicate sampling);
- sampling precision (within-replicate sampling or pseudoreplication1);
- study spatial scale (size of replicates and spatial scale of study area);
- study design (e.g., before–after, control–impacts, time series, before–after–control–impacts);
- outcome measurement methods and equipment;
- description of any data manipulation, modeling, or statistical analysis undertaken.

These are not onerous requirements, and although they have been the subject of previous calls to adequately document archived data, we must reiterate the need for this information to ensure that the legacy of primary research is maximized. We also echo calls for better reporting of summary data (i.e., means, variability, and sample sizes) that permit meta-analysis (e.g., Haddaway 2015), a valuable method for synthesizing results from multiple studies (Stewart 2010). Inclusion of these details will ensure study results are truly verifiable and have a legacy and impact beyond acting as a case study. As this information is readily available to authors, its required inclusion should be clearly specified in the "guide for authors" of peer-reviewed journals and checked by journal referees. If a stricter code of conduct became common practice in scientific reporting, the feasibility and success of large meta-analyses and systematic reviews would be greatly enhanced.
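A referee's screening of such minimum requirements could be automated as a simple completeness check. The field names below are an ad hoc mapping of the checklist, not a published metadata standard.

```python
# Illustrative completeness check for the methodological minimum
# requirements listed above. Field names are an ad hoc mapping of the
# checklist, not an established metadata schema.

REQUIRED_FIELDS = [
    "experimental_setting",       # field location/climate or lab conditions
    "study_dates_and_duration",
    "selection_procedures",       # sample selection and treatment allocation
    "true_replication",
    "subsampling",
    "sampling_precision",
    "spatial_scale",
    "study_design",               # e.g., before-after, control-impact, BACI
    "outcome_measurement",
    "data_analysis",              # manipulation, modeling, statistics
]

def missing_methodological_detail(study_metadata):
    """Return the required fields that are absent or empty in a study's
    metadata dict."""
    return [f for f in REQUIRED_FIELDS if not study_metadata.get(f)]

# Example: a study reporting only its design and outcome methods.
study = {"study_design": "BACI", "outcome_measurement": "gas flux chambers"}
gaps = missing_methodological_detail(study)
# `gaps` lists the eight unreported fields, flagging the study as low detail.
```

A check of this kind could sit in a journal's submission pipeline or a systematic reviewer's data-extraction workflow, turning the prose checklist into a reproducible screen.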

Conflict of Interest

None declared.
References (8 in total)

1.  Data archiving in ecology and evolution: best practices.

Authors:  Michael C Whitlock
Journal:  Trends Ecol Evol       Date:  2010-12-14       Impact factor: 17.712

2.  Imputing missing standard deviations in meta-analyses can provide accurate results.

Authors:  Toshi A Furukawa; Corrado Barbui; Andrea Cipriani; Paolo Brambilla; Norio Watanabe
Journal:  J Clin Epidemiol       Date:  2006-01       Impact factor: 6.437

3.  Reporting standards in experimental studies.

Authors:  Helmut Hillebrand; Jessica Gurevitch
Journal:  Ecol Lett       Date:  2013-09-30       Impact factor: 9.492

4.  Meta-analysis in applied ecology.

Authors:  Gavin Stewart
Journal:  Biol Lett       Date:  2009-09-23       Impact factor: 3.703

5.  Reproducibility and repeatability.

Authors:  P Slezák; I Waczulíková
Journal:  Physiol Res       Date:  2011       Impact factor: 1.881

6.  A call for better reporting of conservation research data for use in meta-analyses.

Authors:  Neal R Haddaway
Journal:  Conserv Biol       Date:  2015-01-14       Impact factor: 6.560

7.  Effects of climate-induced increases in summer drought on riparian plant species: a meta-analysis.

Authors:  Annemarie G Garssen; Jos T A Verhoeven; Merel B Soons
Journal:  Freshw Biol       Date:  2014-02-16       Impact factor: 3.809

8.  Making research articles fit for purpose: structured reporting of key methods and findings.

Authors:  Douglas G Altman
Journal:  Trials       Date:  2015-02-20       Impact factor: 2.279

