Speech comprehension aided by multiple modalities: behavioural and neural interactions.

Carolyn McGettigan, Andrew Faulkner, Irene Altarelli, Jonas Obleser, Harriet Baverstock, Sophie K Scott.

Abstract

Speech comprehension is a complex human skill, the performance of which requires the perceiver to combine information from several sources - e.g. voice, face, gesture, linguistic context - to achieve an intelligible and interpretable percept. We describe a functional imaging investigation of how auditory, visual and linguistic information interact to facilitate comprehension. Our specific aims were to investigate the neural responses to these different information sources, alone and in interaction, and further to use behavioural speech comprehension scores to address sites of intelligibility-related activation in multifactorial speech comprehension. In fMRI, participants passively watched videos of spoken sentences, in which we varied Auditory Clarity (with noise-vocoding), Visual Clarity (with Gaussian blurring) and Linguistic Predictability. Main effects of enhanced signal with increased auditory and visual clarity were observed in overlapping regions of posterior STS. Two-way interactions of the factors (auditory × visual, auditory × predictability) in the neural data were observed outside temporal cortex, where positive signal change in response to clearer facial information and greater semantic predictability was greatest at intermediate levels of auditory clarity. Overall changes in stimulus intelligibility by condition (as determined using an independent behavioural experiment) were reflected in the neural data by increased activation predominantly in bilateral dorsolateral temporal cortex, as well as inferior frontal cortex and left fusiform gyrus. Specific investigation of intelligibility changes at intermediate auditory clarity revealed a set of regions, including posterior STS and fusiform gyrus, showing enhanced responses to both visual and linguistic information. 
Finally, an individual differences analysis showed that greater comprehension performance in the scanning participants (measured in a post-scan behavioural test) was associated with increased activation in left inferior frontal gyrus and left posterior STS. The current multimodal speech comprehension paradigm demonstrates recruitment of a wide comprehension network in the brain, in which posterior STS and fusiform gyrus form sites for convergence of auditory, visual and linguistic information, while left-dominant sites in temporal and frontal cortex support successful comprehension.
Copyright © 2012 Elsevier Ltd. All rights reserved.

Year:  2012        PMID: 22266262      PMCID: PMC4050300          DOI: 10.1016/j.neuropsychologia.2012.01.010

Source DB:  PubMed          Journal:  Neuropsychologia        ISSN: 0028-3932            Impact factor:   3.139
