| Literature DB >> 29911971 |
Yosef Singer¹, Yayoi Teramoto¹, Ben D. B. Willmore¹, Jan W. H. Schnupp², Andrew J. King¹, Nicol S. Harper¹.
Abstract
Neurons in sensory cortex are tuned to diverse features in natural scenes. But what determines which features neurons become selective to? Here we explore the idea that neuronal selectivity is optimized to represent features in the recent sensory past that best predict immediate future inputs. We tested this hypothesis using simple feedforward neural networks, which were trained to predict the next few moments of video or audio in clips of natural scenes. The networks developed receptive fields that closely matched those of real cortical neurons in different mammalian species, including the oriented spatial tuning of primary visual cortex, the frequency selectivity of primary auditory cortex and, most notably, their temporal tuning properties. Furthermore, the better a network predicted future inputs, the more closely its receptive fields resembled those in the brain. This suggests that sensory processing is optimized to extract those features with the most capacity to predict future input.
Keywords: auditory; cortex; ferret; model; neuroscience; normative; prediction
Year: 2018 PMID: 29911971 PMCID: PMC6108826 DOI: 10.7554/eLife.31557
Source DB: PubMed Journal: eLife ISSN: 2050-084X Impact factor: 8.713
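The temporal-prediction principle described in the abstract can be illustrated with a toy sketch (this is not the authors' code): a model learns to predict the next moment of a signal from its recent past, and the learned weights play the role of a temporal receptive field. Here a noisy sinusoid stands in for natural audio, and a single linear layer solved in closed form stands in for the paper's feedforward networks; all names and hyperparameters are illustrative assumptions.

```python
# Minimal temporal-prediction sketch: predict the next sample of a
# signal from the preceding `past` samples (illustrative, not the
# authors' implementation).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic input: a slow oscillation plus sensory noise.
t = np.arange(2000)
signal = np.sin(2 * np.pi * t / 50) + 0.1 * rng.standard_normal(t.size)

past, future = 20, 1  # predict 1 step ahead from the 20 preceding samples
X = np.stack([signal[i:i + past]
              for i in range(len(signal) - past - future)])
y = signal[past:past + len(X)]

# Fit the linear "network" by least squares: w minimizes ||X w - y||^2.
# The weight vector w is the model's temporal receptive field.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

mse = float(np.mean((X @ w - y) ** 2))
baseline = float(np.var(y))  # error of always predicting the mean
print(f"prediction MSE {mse:.4f} vs mean-baseline {baseline:.4f}")
```

A predictable signal yields a prediction error far below the mean-predictor baseline; the paper's point is that networks trained on this kind of objective, but with natural video and audio as input, develop receptive fields resembling those of cortical neurons.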