| Literature DB >> 31379527 |
M F Assaneo1, J M Rimmele2, J Orpella1, P Ripollés1, R de Diego-Balaguer3,4,5,6, D Poeppel1,2.
Abstract
The lateralization of neuronal processing underpinning hearing, speech, language, and music is widely studied, vigorously debated, and still not understood in a satisfactory manner. One set of hypotheses focuses on the temporal structure of perceptual experience and links auditory cortex asymmetries to underlying differences in neural populations with differential temporal sensitivity (e.g., ideas advanced by Zatorre et al. (2002) and Poeppel (2003)). The Asymmetric Sampling in Time (AST) theory (Poeppel, 2003) builds on cytoarchitectonic differences between the auditory cortices and predicts that modulation frequencies within, roughly, the range of the syllable rate are more accurately tracked by the right hemisphere. To date, this conjecture is reasonably well supported: although there is some heterogeneity in the reported findings, the predicted asymmetrical entrainment has been observed under various experimental protocols. Here, we show that under specific processing demands, the rightward dominance disappears. We propose an enriched and modified version of the asymmetric sampling hypothesis in the context of speech. Recent work (Rimmele et al., 2018b) proposes two different mechanisms underlying auditory tracking of the speech envelope: one derived from the intrinsic oscillatory properties of auditory regions; the other induced by top-down signals from non-auditory regions of the brain. We propose that under non-speech listening conditions the intrinsic auditory mechanism dominates and thus, in line with AST, entrainment is rightward lateralized, as is widely observed. However, (i) depending on individual structural/functional brain differences, and/or (ii) in the context of specific speech listening conditions, the relative weight of the top-down mechanism can increase. In this scenario, the typically observed auditory sampling asymmetry (and its rightward dominance) diminishes or vanishes.
Keywords: MEG (magnetoencephalography); asymmetrical sampling; brain to stimulus synchronization; speech envelope tracking; speech perception
Year: 2019 PMID: 31379527 PMCID: PMC6650591 DOI: 10.3389/fnint.2019.00028
Source DB: PubMed Journal: Front Integr Neurosci ISSN: 1662-5145
FIGURE 1 | Rightward dominance is affected by speech rate during a syllable perception task. (A) PLV between auditory cortices and speech envelope, increment from resting state. Mean PLV around the syllable rate of each condition (syllable rate ± 0.5 Hz). Left auditory synchronization shows no change between conditions (Kruskal-Wallis test: χ2(4) = 5.6, two-sided p = 0.23), but the right auditory cortex does (Kruskal-Wallis test: χ2(4) = 12.45, two-sided p = 0.014) (adapted from Assaneo and Poeppel, 2018). (B) Auditory coupling asymmetry for the different syllable rate conditions: the degree of asymmetry is modulated by the syllable rate (Kruskal-Wallis test: χ2(4) = 13.63, two-sided p = 0.008). The asymmetry is significantly above zero only for 4.5 and 5.5 syllables per second. * Stands for two-sided p < 0.05 (Wilcoxon signed-rank test, FDR-corrected). Dots: individual participants; the scattering on the X-axis is for visualization purposes. Black lines: mean across participants. Shaded region: SD. N = 17.
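The quantities in this caption are standard measures. The phase-locking value (PLV) is the magnitude of the mean phase difference between two narrow-band signals, and the asymmetry index is typically a normalized right-minus-left contrast. The sketch below illustrates both definitions; it is not the authors' analysis code, and the exact normalization of their asymmetry index is an assumption here.

```python
import numpy as np
from scipy.signal import hilbert

def phase_locking_value(x, y):
    """PLV between two narrow-band signals: |mean(exp(i*(phi_x - phi_y)))|.
    Ranges from 0 (no consistent phase relation) to 1 (perfect locking)."""
    phase_diff = np.angle(hilbert(x)) - np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * phase_diff)))

def asymmetry_index(plv_left, plv_right):
    """Normalized right-minus-left contrast (assumed form):
    positive values indicate rightward dominance."""
    return (plv_right - plv_left) / (plv_right + plv_left)
```

For a signal compared with itself the phase difference is constant, so the PLV is exactly 1; uncorrelated broadband signals give values near 0. In practice both signals would first be band-pass filtered around the syllable rate (here, condition rate ± 0.5 Hz).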
FIGURE 2 | The degree of asymmetry correlates with the strength of auditory-frontal connectivity during a syllable perception task. Synchronization (PLV) between auditory cortex activity and the perceived speech envelope in left and right hemispheres: (A) all subjects pooled and (B) low synchronizers in the upper (blue) panel versus high synchronizers in the lower panel. (C) Auditory coupling asymmetry: comparison between groups [B,C reanalyzed and replotted from Assaneo et al. (2019)]. (D) Connectivity at 4.5 Hz (wPLI) between early auditory cortex (BA 41/42, TE 1.0 and TE 1.2 in blue) and frontal regions (caudal BA 45 and IFS in red) correlates with the degree of auditory-to-stimulus coupling asymmetry (Spearman correlation, two-sided p < 0.05, FDR-corrected). Scatter plots of the correlation between auditory-to-stimulus asymmetry and the wPLI between left auditory cortex and areas highlighted in red: (E) inferior frontal sulcus (IFS; N = 37) and (F) caudal BA 45 (N = 36). Orange/blue corresponds to high/low synchronizers, respectively. ∗∗ Stands for two-sided p < 0.005 (Wilcoxon signed-rank test), * for two-sided p < 0.05 (Mann-Whitney-Wilcoxon test). Dots: individual participants; in panels A–C the scattering on the X-axis is for visualization purposes. Black lines: mean across participants. Shaded region in panels A–C: SD. Red line: linear regression.
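The connectivity measure in panel D, the weighted phase lag index (wPLI; Vinck et al., 2011), downweights zero-lag (potentially volume-conducted) coupling by using only the imaginary part of the cross-spectrum. A minimal sketch of its standard definition, estimated from segment-wise FFT cross-spectra (again not the authors' pipeline; segment count and windowing are illustrative choices):

```python
import numpy as np

def wpli(x, y, fs, fband, nseg=10):
    """Weighted phase lag index averaged over a frequency band.
    wPLI = |E[Im(Sxy)]| / E[|Im(Sxy)|], with Sxy the cross-spectrum
    estimated over nseg non-overlapping Hann-windowed segments."""
    seg_len = len(x) // nseg
    imags = []
    for k in range(nseg):
        win = np.hanning(seg_len)
        X = np.fft.rfft(x[k * seg_len:(k + 1) * seg_len] * win)
        Y = np.fft.rfft(y[k * seg_len:(k + 1) * seg_len] * win)
        imags.append(np.imag(X * np.conj(Y)))   # imaginary cross-spectrum
    imags = np.array(imags)                     # shape: (nseg, nfreq)
    num = np.abs(np.mean(imags, axis=0))
    den = np.mean(np.abs(imags), axis=0)
    w = np.divide(num, den, out=np.zeros_like(num), where=den > 0)
    freqs = np.fft.rfftfreq(seg_len, d=1 / fs)
    band = (freqs >= fband[0]) & (freqs <= fband[1])
    return np.mean(w[band])
```

Two signals with a consistent nonzero phase lag yield wPLI near 1, whereas a signal paired with itself (zero lag, purely real cross-spectrum) yields exactly 0, which is the property that makes the measure robust to field spread in MEG.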
FIGURE 3 | Semantic access reverses the classical rightward dominance for envelope tracking. (A) Auditory coupling asymmetry index: comparison between conditions (N = 17; two-sided p = 0.035, paired Wilcoxon signed-rank test). (B) Coherence between Heschl's gyrus (right and left hemispheres) activity and the auditory stimulus envelope, in a frequency band around 4 Hz. Left panel: Semantic condition (N = 17; two-sided p = 0.010, paired Wilcoxon signed-rank test). Right panel: Non-Semantic condition (N = 17; two-sided p = 0.15, paired Wilcoxon signed-rank test). (C) Scatterplot of the auditory coupling asymmetry as a function of the connectivity between left STG and IFG, in a frequency band around 4 Hz. Left panel: Semantic condition (N = 16; Spearman correlation coefficient r = 0.4, two-sided p = 0.12). Right panel: Non-Semantic condition (N = 17; Spearman correlation coefficient r = –0.46, two-sided p = 0.063). In all panels: pink/green correspond to Semantic/Non-Semantic (German words/Turkish pseudo-words), respectively; dots: individual participants. In panels A,B: the shaded region represents SD, the black line the mean across participants, and the scattering of dots on the X-axis is for visualization purposes. In panel C: the black line represents the linear regression.
Different origins for the observed auditory-to-envelope synchronization.
| | Processing level | Dominant hemisphere | Mechanism |
| (i) | Paralinguistic/Phonetic | Right | Intrinsic auditory |
| (ii) | Phonological/Semantic | Left | Externally driven |