
Reading speech from still and moving faces: the neural substrates of visible speech.

Gemma A Calvert, Ruth Campbell.

Abstract

Speech is perceived both by ear and by eye. Unlike heard speech, some seen speech gestures can be captured in stilled image sequences. Previous studies have shown that in hearing people, natural time-varying silent seen speech can access the auditory cortex (left superior temporal regions). Using functional magnetic resonance imaging (fMRI), the present study explored the extent to which this circuitry was activated when seen speech was deprived of its time-varying characteristics. In the scanner, hearing participants were instructed to look for a prespecified visible speech target sequence ("voo" or "ahv") among other monosyllables. In one condition, the image sequence comprised a series of stilled key frames showing apical gestures (e.g., separate frames for "v" and "oo" [from the target] or "ee" and "m" [i.e., from nontarget syllables]). In the other condition, natural speech movement of the same overall segment duration was seen. In contrast to a baseline condition in which the letter "V" was superimposed on a resting face, stilled speech face images generated activation in posterior cortical regions associated with the perception of biological movement, despite the lack of apparent movement in the speech image sequence. Activation was also detected in traditional speech-processing regions including the left inferior frontal (Broca's) area, left superior temporal sulcus (STS), and left supramarginal gyrus (the dorsal aspect of Wernicke's area). Stilled speech sequences also generated activation in the ventral premotor cortex and anterior inferior parietal sulcus bilaterally. Moving faces generated significantly greater cortical activation than stilled face sequences, and in similar regions. However, a number of differences between stilled and moving speech were also observed. In the visual cortex, stilled faces generated relatively more activation in primary visual regions (V1/V2), while visual movement areas (V5/MT+) were activated to a greater extent by moving faces. Cortical regions activated more by naturally moving speaking faces included the auditory cortex (Brodmann's Areas 41/42; lateral parts of Heschl's gyrus) and the left STS and inferior frontal gyrus. Seen speech with normal time-varying characteristics appears to have preferential access to "purely" auditory processing regions specialized for language, possibly via acquired dynamic audiovisual integration mechanisms in STS. When seen speech lacks natural time-varying characteristics, access to speech-processing systems in the left temporal lobe may be achieved predominantly via action-based speech representations, realized in the ventral premotor cortex.

Year:  2003        PMID: 12590843     DOI: 10.1162/089892903321107828

Source DB:  PubMed          Journal:  J Cogn Neurosci        ISSN: 0898-929X            Impact factor:   3.225


Related articles (showing 10 of 113 in total):

1.  Developmental Shifts in Detection and Attention for Auditory, Visual, and Audiovisual Speech.

Authors:  Susan Jerger; Markus F Damian; Cassandra Karl; Hervé Abdi
Journal:  J Speech Lang Hear Res       Date:  2018-12-10       Impact factor: 2.297

2.  Bimodal speech: early suppressive visual effects in human auditory cortex.

Authors:  Julien Besle; Alexandra Fort; Claude Delpuech; Marie-Hélène Giard
Journal:  Eur J Neurosci       Date:  2004-10       Impact factor: 3.386

Review 3.  Some behavioral and neurobiological constraints on theories of audiovisual speech integration: a review and suggestions for new directions.

Authors:  Nicholas Altieri; David B Pisoni; James T Townsend
Journal:  Seeing Perceiving       Date:  2011-09-29

4.  The perception of visible speech: estimation of speech rate and detection of time reversals.

Authors:  Paolo Viviani; Francesca Figliozzi; Francesco Lacquaniti
Journal:  Exp Brain Res       Date:  2011-10-11       Impact factor: 1.972

5.  The functional neuroanatomy of language.

Authors:  Gregory Hickok
Journal:  Phys Life Rev       Date:  2009-09       Impact factor: 11.025

6.  Neural correlates of interindividual differences in children's audiovisual speech perception.

Authors:  Audrey R Nath; Eswen E Fava; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-09-28       Impact factor: 6.167

Review 7.  Social cognition and the cerebellum: A meta-analytic connectivity analysis.

Authors:  Frank Van Overwalle; Tine D'aes; Peter Mariën
Journal:  Hum Brain Mapp       Date:  2015-09-30       Impact factor: 5.038

Review 8.  Odor/taste integration and the perception of flavor.

Authors:  Dana M Small; John Prescott
Journal:  Exp Brain Res       Date:  2005-07-19       Impact factor: 1.972

9.  A continuous semantic space describes the representation of thousands of object and action categories across the human brain.

Authors:  Alexander G Huth; Shinji Nishimoto; An T Vu; Jack L Gallant
Journal:  Neuron       Date:  2012-12-20       Impact factor: 17.173

10.  Brain function overlaps when people observe emblems, speech, and grasping.

Authors:  Michael Andric; Ana Solodkin; Giovanni Buccino; Susan Goldin-Meadow; Giacomo Rizzolatti; Steven L Small
Journal:  Neuropsychologia       Date:  2013-04-11       Impact factor: 3.139

