Maëva Michon, Gonzalo Boncompte, Vladimir López.
Abstract
The human brain generates predictions about future events. During face-to-face conversations, visemic information is used to predict upcoming auditory input. Recent studies suggest that the speech motor system plays a role in these cross-modal predictions; however, these studies usually employ only audio-visual paradigms. Here we tested whether speech sounds can be predicted on the basis of visemic information alone, and to what extent interfering with the orofacial articulatory effectors affects these predictions. We recorded EEG and used the N400 as an index of such predictions. Our results show that N400 amplitude was strongly modulated by visemic salience, consistent with cross-modal speech predictions. Additionally, the N400 ceased to be evoked when syllables' visemes were presented backwards, suggesting that predictions occur only when the observed viseme matches an existing articuleme in the observer's speech motor system (i.e., the articulatory neural sequence required to produce a particular phoneme/viseme). Importantly, we found that interfering with the motor articulatory system strongly disrupted cross-modal predictions. We also observed a late P1000 that was evoked only by syllable-related visual stimuli, but whose amplitude was not modulated by interference with the motor system. The present study provides further evidence of the importance of the speech production system for predicting speech sounds from visemic information at the pre-lexical level. The implications of these results are discussed in the context of a hypothesized trimodal repertoire for speech, in which speech perception is conceived as a highly interactive process that involves not only your ears but also your eyes, lips and tongue.
Keywords: ERPs; articuleme; cross-modal prediction; orofacial movements; place of articulation; speech motor system; viseme
Year: 2020 PMID: 33192386 PMCID: PMC7653187 DOI: 10.3389/fnhum.2020.538619
Source DB: PubMed Journal: Front Hum Neurosci ISSN: 1662-5161 Impact factor: 3.169
Figure 1. Experimental procedure. (A) Timeline description of trials. (B) Illustration of the places of articulation (PoA) of the syllables used. (C) Depiction of the position of the effector depressor in participants' mouths.
Figure 2. Effects of experimental manipulation on the N400 component. (A) Effect of the different PoA of forward syllables at electrode Fz. (B) Topographical maps of the N400 according to the PoA at the peak-amplitude latency (500 ms). (C) Effect of the effector depressor on the perception of forward bilabial syllables at electrode Fz. (D) Effect of forward versus backward syllables on the perception of bilabial syllables without the effector depressor at electrode Fz.
Table 1. Simple main effects of the effector depressor on bilabial syllables.

| Condition | Electrode | Sum of squares | df | Mean square | F | p |
| --- | --- | --- | --- | --- | --- | --- |
| Forward | F3 | 23.582 | 1 | 23.582 | 4.233 | < 0.05 |
| Forward | Fz | 35.970 | 1 | 35.970 | 5.335 | < 0.05 |
| Forward | F4 | 31.175 | 1 | 31.175 | 5.808 | < 0.05 |
| Backward | F3 | 2.230 | 1 | 2.230 | 0.548 | 0.466 |
| Backward | Fz | 2.195 | 1 | 2.195 | 0.405 | 0.530 |
| Backward | F4 | 4.277 | 1 | 4.277 | 1.043 | 0.316 |
Table 2. Simple main effects of forward vs. backward displaying of bilabial syllables.

| Experiment | Electrode | Sum of squares | df | Mean square | F | p |
| --- | --- | --- | --- | --- | --- | --- |
| Experiment 1 | F3 | 34.864 | 1 | 34.864 | 10.480 | < 0.01 |
| Experiment 1 | Fz | 38.014 | 1 | 38.014 | 9.140 | < 0.01 |
| Experiment 1 | F4 | 39.363 | 1 | 39.363 | 8.393 | < 0.01 |
| Experiment 2 | F3 | 0.198 | 1 | 0.198 | 0.110 | 0.743 |
| Experiment 2 | Fz | 1.726 | 1 | 1.726 | 0.640 | 0.431 |
| Experiment 2 | F4 | 1.898 | 1 | 1.898 | 0.607 | 0.443 |
Figure 3. Effect of conditions on the P1000 component in experiment 1. (A) The gray bar represents the time window in which significant differences in amplitude were found. (B) Forward and backward syllables (top maps) and non-syllabic conditions (bottom maps) showed different topographic representations.