Anne Keitel, Joachim Gross, Christoph Kayser.
Abstract
Visual speech carried by lip movements is an integral part of communication. Yet it remains unclear to what extent visual and acoustic speech comprehension are mediated by the same brain regions. Using multivariate classification of full-brain MEG data, we first probed where the brain represents acoustically and visually conveyed word identities. We then tested where these sensory-driven representations are predictive of participants' trial-wise comprehension. The comprehension-relevant representations of auditory and visual speech converged only in anterior angular and inferior frontal regions and were spatially dissociated from the representations that best reflected the sensory-driven word identity. These results provide a neural explanation for the behavioural dissociation of acoustic and visual speech comprehension and suggest that cerebral representations encoding word identity may be more modality-specific than often assumed.
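The abstract describes a two-step analysis: first, cross-validated multivariate classification of word identity from MEG activity; second, relating the decoder's trial-wise output to participants' comprehension. The following is a minimal sketch of the first step, assuming hypothetical per-trial MEG feature matrices and using scikit-learn; the variable names, shapes, and classifier choice are illustrative assumptions, not the authors' actual pipeline.

```python
# Hypothetical sketch of trial-wise word-identity decoding from MEG data,
# in the spirit of the multivariate classification described above.
# Shapes, feature definitions, and the LDA classifier are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_trials, n_features, n_words = 120, 64, 8

# X: one feature vector per trial (e.g., source-localised MEG activity
# within a brain region); y: the word identity presented on each trial.
X = rng.standard_normal((n_trials, n_features))
y = rng.integers(0, n_words, size=n_trials)

# Cross-validated prediction of word identity from the neural data.
clf = LinearDiscriminantAnalysis()
y_pred = cross_val_predict(clf, X, y, cv=5)

# Trial-wise decoder correctness; relating this vector to participants'
# trial-wise comprehension would be the second step described above.
decoder_correct = (y_pred == y)
print(f"decoding accuracy: {decoder_correct.mean():.2f} "
      f"(chance ~ {1 / n_words:.2f})")
```

Repeating such an analysis per brain region, separately for auditory and visual speech, would yield the region-wise maps of sensory-driven and comprehension-relevant representations that the abstract contrasts.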
Keywords: MEG; auditory pathways; computational biology; human; lip reading; neuroscience; speech decoding; systems biology; visual speech; word classification
Year: 2020 PMID: 32831168 PMCID: PMC7470824 DOI: 10.7554/eLife.56972
Source DB: PubMed Journal: eLife ISSN: 2050-084X Impact factor: 8.140