| Literature DB >> 32869058 |
Hun S Choi, William D Marslen-Wilson, Bingjiang Lyu, Billi Randall, Lorraine K Tyler.
Abstract
Communication through spoken language is a central human capacity, involving a wide range of complex computations that incrementally integrate each word into a meaningful sentence-level interpretation. However, surprisingly little is known about the spatiotemporal properties of the neurobiological systems that support these dynamic predictive and integrative computations. Here we focus on prediction, a core incremental processing operation that guides the interpretation of each upcoming word with respect to its preceding context. To investigate the neurobiological basis of how semantic constraints change and evolve as the words of a sentence accumulate over time, we conducted a spoken sentence comprehension study in which we analyzed multivariate patterns of neural activity recorded with source-localized electro/magnetoencephalography (EMEG), using computational models that capture the semantic constraints the prior context places on each upcoming word. Our results provide insights into the predictive operations subserved by different regions within a bi-hemispheric system, which over time generate, refine, and evaluate constraints on each word as it is heard.
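The abstract's analysis method, representational similarity analysis (RSA), compares the dissimilarity structure of model-derived features with that of recorded neural patterns. A minimal sketch of the core RSA computation is below; the data are simulated and all dimensions, variable names, and the choice of correlation distance and Spearman comparison are illustrative assumptions, not the paper's actual pipeline.

```python
# Minimal RSA sketch: correlate a model-derived representational
# dissimilarity matrix (RDM) with a (here simulated) neural RDM.
# All data and dimensions are hypothetical.
import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_items, n_model_dims, n_sensors = 20, 50, 30
model_features = rng.normal(size=(n_items, n_model_dims))   # e.g. semantic-constraint vectors
neural_patterns = rng.normal(size=(n_items, n_sensors))     # e.g. source-localized EMEG patterns

# Build RDMs with correlation distance (1 - Pearson r) between items
model_rdm = squareform(pdist(model_features, metric="correlation"))
neural_rdm = squareform(pdist(neural_patterns, metric="correlation"))

# Compare the RDMs' upper triangles with Spearman's rank correlation
iu = np.triu_indices(n_items, k=1)
rho, p = spearmanr(model_rdm[iu], neural_rdm[iu])
print(f"model-neural RSA correlation: rho={rho:.3f}, p={p:.3f}")
```

In practice this comparison would be repeated across time points and source-localized regions to trace where and when the model's semantic constraints are reflected in the neural signal.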
Keywords: Bayesian language modeling; electro/magnetoencephalography; incremental prediction; representational similarity analysis; semantics
Year: 2021 PMID: 32869058 PMCID: PMC7727355 DOI: 10.1093/cercor/bhaa222
Source DB: PubMed Journal: Cereb Cortex ISSN: 1047-3211 Impact factor: 5.357