| Literature DB >> 19396166 |
Nikolaus Kriegeskorte, W Kyle Simmons, Patrick S F Bellgowan, Chris I Baker.
Abstract
A neuroscientific experiment typically generates a large amount of data, of which only a small fraction is analyzed in detail and presented in a publication. However, selection among noisy measurements can render circular an otherwise appropriate analysis and invalidate results. Here we argue that systems neuroscience needs to adjust some widespread practices to avoid the circularity that can arise from selection. In particular, 'double dipping', the use of the same dataset for selection and selective analysis, will give distorted descriptive statistics and invalid statistical inference whenever the results statistics are not inherently independent of the selection criteria under the null hypothesis. To demonstrate the problem, we apply widely used analyses to noise data known not to contain the experimental effects in question. Spurious effects can appear in the context of both univariate activation analysis and multivariate pattern-information analysis. We suggest a policy for avoiding circularity.
Year: 2009 PMID: 19396166 PMCID: PMC2841687 DOI: 10.1038/nn.2303
Source DB: PubMed Journal: Nat Neurosci ISSN: 1097-6256 Impact factor: 24.884
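The double-dipping bias described in the abstract can be illustrated with a small simulation (a sketch, not code from the paper; all names and parameters here are illustrative). On pure-noise data with no true condition difference, selecting the most "responsive" voxels and then estimating their effect on the same data yields a spurious positive effect, whereas selecting on one half of the trials and estimating on the held-out half does not:

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials, n_top = 1000, 40, 50

# Pure noise for two conditions: no voxel carries a true effect
a = rng.standard_normal((n_voxels, n_trials))
b = rng.standard_normal((n_voxels, n_trials))

# Circular ("double dipping"): select voxels and estimate the
# condition difference on the same dataset
diff = a.mean(axis=1) - b.mean(axis=1)
selected = np.argsort(diff)[-n_top:]        # top "responsive" voxels
circular_effect = diff[selected].mean()     # biased upward by selection

# Independent analysis: select on the first half of trials,
# estimate on the held-out second half
half = n_trials // 2
diff_train = a[:, :half].mean(axis=1) - b[:, :half].mean(axis=1)
diff_test = a[:, half:].mean(axis=1) - b[:, half:].mean(axis=1)
selected_ind = np.argsort(diff_train)[-n_top:]
independent_effect = diff_test[selected_ind].mean()  # ~0 in expectation

print(f"circular estimate:    {circular_effect:.3f}")
print(f"independent estimate: {independent_effect:.3f}")
```

The circular estimate is substantially positive even though the data are noise, because the selection criterion and the results statistic are not independent under the null hypothesis; the split-data estimate removes that dependence, which is the remedy the paper's policy centers on.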