Increased Connectivity among Sensory and Motor Regions during Visual and Audiovisual Speech Perception.

Jonathan E Peelle, Brent Spehar, Michael S Jones, Sarah McConkey, Joel Myerson, Sandra Hale, Mitchell S Sommers, Nancy Tye-Murray.

Abstract

In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is degraded. Here, we used fMRI to monitor brain activity while adult humans (n = 60) were presented with visual-only, auditory-only, and audiovisual words. The audiovisual words were presented in quiet and in several signal-to-noise ratios. As expected, audiovisual speech perception recruited both auditory and visual cortex, with some evidence for increased recruitment of premotor cortex in some conditions (including in substantial background noise). We then investigated neural connectivity using psychophysiological interaction analysis with seed regions in both primary auditory cortex and primary visual cortex. Connectivity between auditory and visual cortices was stronger in audiovisual conditions than in unimodal conditions, including a wide network of regions in posterior temporal cortex and prefrontal cortex. In addition to whole-brain analyses, we also conducted a region-of-interest analysis on the left posterior superior temporal sulcus (pSTS), implicated in many previous studies of audiovisual speech perception. We found evidence for both activity and effective connectivity in pSTS for visual-only and audiovisual speech, although these were not significant in whole-brain analyses. Together, our results suggest a prominent role for cross-region synchronization in understanding both visual-only and audiovisual speech that complements activity in integrative brain regions like pSTS.

Significance Statement

In everyday conversation, we usually process the talker's face as well as the sound of the talker's voice. Access to visual speech information is particularly useful when the auditory signal is hard to understand (e.g., background noise). Prior work has suggested that specialized regions of the brain may play a critical role in integrating information from visual and auditory speech. Here, we show that a complementary mechanism relying on synchronized brain activity among sensory and motor regions may also play a critical role. These findings encourage reconceptualizing audiovisual integration in the context of coordinated network activity.
Copyright © 2022 the authors.
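
The psychophysiological interaction (PPI) analysis mentioned in the abstract asks whether the coupling between a seed region's time course and activity elsewhere in the brain changes with the experimental condition. As a rough illustration only, the sketch below builds a PPI interaction regressor from simulated data in Python with NumPy; the variable names, block structure, and data are hypothetical, and this is not the authors' actual pipeline (which would also involve HRF deconvolution, confound regressors, and seed extraction from real fMRI data).

    import numpy as np

    # Illustrative (hypothetical) PPI regressor construction; not the study's code.
    rng = np.random.default_rng(0)
    n_scans = 200

    # Seed region time course (e.g., extracted from a primary auditory cortex ROI).
    seed = rng.standard_normal(n_scans)

    # Psychological regressor: +1 during audiovisual blocks, -1 during unimodal
    # blocks (alternating 20-scan blocks, purely for illustration).
    condition = np.where((np.arange(n_scans) // 20) % 2 == 0, 1.0, -1.0)

    # Interaction (PPI) term: product of the mean-centered seed signal and the
    # centered condition vector. Deconvolution to the neural level is omitted.
    ppi = (seed - seed.mean()) * (condition - condition.mean())

    # Design matrix: intercept, condition, seed, and interaction.
    X = np.column_stack([np.ones(n_scans), condition, seed, ppi])

    # Simulated target-voxel time series; in practice this would be real BOLD data.
    y = rng.standard_normal(n_scans)

    # Ordinary least squares fit; the weight on the PPI column indexes how strongly
    # seed-voxel coupling differs between audiovisual and unimodal conditions.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    print("PPI beta:", beta[3])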

Keywords:  audiovisual integration; language; lipreading; speech; speechreading

Year:  2021        PMID: 34815317      PMCID: PMC8802926          DOI: 10.1523/JNEUROSCI.0114-21.2021

Source DB:  PubMed          Journal:  J Neurosci        ISSN: 0270-6474            Impact factor:   6.709


References: 52 in total

1.  Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain.

Authors:  N Tzourio-Mazoyer; B Landeau; D Papathanassiou; F Crivello; O Etard; N Delcroix; B Mazoyer; M Joliot
Journal:  Neuroimage       Date:  2002-01       Impact factor: 6.556

2.  Unraveling multisensory integration: patchy organization within human STS multisensory cortex.

Authors:  Michael S Beauchamp; Brenna D Argall; Jerzy Bodurka; Jeff H Duyn; Alex Martin
Journal:  Nat Neurosci       Date:  2004-10-10       Impact factor: 24.884

3. (Review) Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

4.  Hearing lips and seeing voices.

Authors:  H McGurk; J MacDonald
Journal:  Nature       Date:  1976-12       Impact factor: 49.962

5.  Judging the relative duration of multimodal short empty time intervals.

Authors:  S Grondin; R Rousseau
Journal:  Percept Psychophys       Date:  1991-03

6.  Two cortical mechanisms support the integration of visual and auditory speech: a hypothesis and preliminary data.

Authors:  Kayoko Okada; Gregory Hickok
Journal:  Neurosci Lett       Date:  2009-01-29       Impact factor: 3.046

7.  Audiovisual integration in human superior temporal sulcus: Inverse effectiveness and the neural processing of speech and object recognition.

Authors:  Ryan A Stevenson; Thomas W James
Journal:  Neuroimage       Date:  2008-10-10       Impact factor: 6.556

8.  Improved auditory cortex imaging using clustered volume acquisitions.

Authors:  W B Edmister; T M Talavage; P J Ledden; R M Weisskoff
Journal:  Hum Brain Mapp       Date:  1999       Impact factor: 5.038

9.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

10.  The effect of speech distortion on the excitability of articulatory motor cortex.

Authors:  Helen E Nuttall; Daniel Kennedy-Higgins; John Hogan; Joseph T Devlin; Patti Adank
Journal:  Neuroimage       Date:  2015-12-28       Impact factor: 6.556
