
Improved classification images with sparse priors in a smooth basis.

Patrick J Mineault, Simon Barthelmé, Christopher C Pack.

Abstract

Classification images provide compelling insight into the strategies used by observers in psychophysical tasks. However, because of the high-dimensional nature of classification images and the limited number of trials that can practically be performed, classification images are often too noisy to be useful unless denoising strategies are adopted. Here we propose a method of estimating classification images by the use of sparse priors in smooth bases and generalized linear models (GLMs). Sparse priors in a smooth basis are used to impose assumptions about the simplicity of observers' internal templates, and they naturally generalize commonly used methods such as smoothing and thresholding. The use of GLMs in this context provides a number of advantages over classic estimation techniques, including the possibility of using stimuli with non-Gaussian statistics, such as natural textures. Using simulations, we show that our method recovers classification images that are typically less noisy and more accurate for a smaller number of trials than previously published techniques. Finally, we verified the efficiency and accuracy of our approach with psychophysical data from a human observer.
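The idea in the abstract can be illustrated with a small simulation. This is a sketch only, not the authors' code: the 1-D stimulus, the Gaussian bump template, the cosine basis (standing in for the paper's smooth basis), the penalty weight, and the ISTA solver are all illustrative assumptions. It simulates a yes/no observer under a logistic GLM, then recovers the classification image by L1-penalized logistic regression on the basis coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D setup (dimensions, template shape, and basis size are
# illustrative assumptions, not values from the paper).
d = 64                                               # stimulus dimension
x = np.linspace(0.0, 1.0, d)
template = np.exp(-0.5 * ((x - 0.4) / 0.08) ** 2)    # observer's smooth template

# Smooth basis: the first k cosine (DCT-like) functions.
k = 16
B = np.stack([np.cos(np.pi * j * x) for j in range(k)], axis=1)   # (d, k)

# Simulate a yes/no experiment: Gaussian noise stimuli, Bernoulli responses
# through a logistic link -- the GLM generative model assumed by the method.
n = 2000
S = rng.standard_normal((n, d))
p = 1.0 / (1.0 + np.exp(-(S @ template)))
y = (rng.random(n) < p).astype(float)

# Express stimuli in basis coordinates and fit L1-penalized logistic
# regression with ISTA (proximal gradient); the L1 penalty on the basis
# coefficients plays the role of the "sparse prior in a smooth basis".
X = S @ B
w = np.zeros(k)
lam = 50.0                                  # illustrative penalty weight
step = 4.0 / np.linalg.norm(X, 2) ** 2      # 1/L for the logistic loss
for _ in range(500):
    grad = X.T @ (1.0 / (1.0 + np.exp(-(X @ w))) - y)
    w = w - step * grad
    w = np.sign(w) * np.maximum(np.abs(w) - step * lam, 0.0)  # soft-threshold

estimate = B @ w    # classification image back in pixel space
```

For comparison, the classic weighted-sums estimate is `S[y == 1].mean(0) - S[y == 0].mean(0)`; at this trial count it is typically far noisier than the basis-constrained GLM estimate, which is the abstract's central claim.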


Year:  2009        PMID: 19810798     DOI: 10.1167/9.10.17

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles (10 in total)

1.  Bayesian inference for generalized linear models for spiking neurons.

Authors:  Sebastian Gerwinn; Jakob H Macke; Matthias Bethge
Journal:  Front Comput Neurosci       Date:  2010-05-28       Impact factor: 2.380

2.  Dynamic sensory cues shape song structure in Drosophila.

Authors:  Philip Coen; Jan Clemens; Andrew J Weinstein; Diego A Pacheco; Yi Deng; Mala Murthy
Journal:  Nature       Date:  2014-03-05       Impact factor: 49.962

3.  Sensorimotor Transformations Underlying Variability in Song Intensity during Drosophila Courtship.

Authors:  Philip Coen; Marjorie Xie; Jan Clemens; Mala Murthy
Journal:  Neuron       Date:  2016-02-03       Impact factor: 17.173

4.  Receptive field inference with localized priors.

Authors:  Mijung Park; Jonathan W Pillow
Journal:  PLoS Comput Biol       Date:  2011-10-27       Impact factor: 4.475

5.  Acoustic duetting in Drosophila virilis relies on the integration of auditory and tactile signals.

Authors:  Kelly M LaRue; Jan Clemens; Gordon J Berman; Mala Murthy
Journal:  Elife       Date:  2015-06-05       Impact factor: 8.140

6.  How musical expertise shapes speech perception: evidence from auditory classification images.

Authors:  Léo Varnet; Tianyun Wang; Chloe Peter; Fanny Meunier; Michel Hoen
Journal:  Sci Rep       Date:  2015-09-24       Impact factor: 4.379

7.  A psychophysical imaging method evidencing auditory cue extraction during speech perception: a group analysis of auditory classification images.

Authors:  Léo Varnet; Kenneth Knoblauch; Willy Serniclaes; Fanny Meunier; Michel Hoen
Journal:  PLoS One       Date:  2015-03-17       Impact factor: 3.240

8.  Direct Viewing of Dyslexics' Compensatory Strategies in Speech in Noise Using Auditory Classification Images.

Authors:  Léo Varnet; Fanny Meunier; Gwendoline Trollé; Michel Hoen
Journal:  PLoS One       Date:  2016-04-21       Impact factor: 3.240

9.  Discovery of a New Song Mode in Drosophila Reveals Hidden Structure in the Sensory and Neural Drivers of Behavior.

Authors:  Jan Clemens; Philip Coen; Frederic A Roemschied; Talmo D Pereira; David Mazumder; Diego E Aldarondo; Diego A Pacheco; Mala Murthy
Journal:  Curr Biol       Date:  2018-07-26       Impact factor: 10.834

10.  Using auditory classification images for the identification of fine acoustic cues used in speech perception.

Authors:  Léo Varnet; Kenneth Knoblauch; Fanny Meunier; Michel Hoen
Journal:  Front Hum Neurosci       Date:  2013-12-16       Impact factor: 3.169

