L S Petro, A T Paton, L Muckli.
Abstract
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195-201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256-1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue 'Auditory and visual scene analysis'.
Keywords: auditory modulation; cortical feedback; primary visual cortex
Year: 2017 PMID: 28044015 PMCID: PMC5206272 DOI: 10.1098/rstb.2016.0104
Source DB: PubMed Journal: Philos Trans R Soc Lond B Biol Sci ISSN: 0962-8436 Impact factor: 6.237
Figure 1. (a) Multisensory areas respond to audio or visual signals individually, or to a spatio-temporal overlap in audio and visual signals. (b) In primary visual cortex, feedforward geniculate inputs activate classical receptive fields, whereas auditory signals activate the non-classical receptive field of V1 neurons, carried by cortical feedback. Top-down auditory signals to V1 may originate directly from auditory cortex, or indirectly via extrastriate cortex or multisensory areas. (c) V1 responses to auditory stimulation have been investigated at different spatial and temporal resolutions (see text). It is possible that feedforward and feedback inputs arrive at individual cortical neurons [4], which can be studied in isolation using appropriate paradigms (such as visual occlusion) that mask feedforward input.
Figure 2. Classification performance for decoding sounds in eccentricity-mapped V1, V2 and V3. The top row reports group classification accuracy from ([3], with permission), in which subjects were blindfolded. The bottom row reports group classification accuracy from a replication of this study, but with an eyes-open fixation task. Surface maps represent significant t-values for sound stimulation only.