
Automatic audiovisual integration in speech perception.

Maurizio Gentilucci, Luigi Cattaneo.

Abstract

Two experiments aimed to determine whether features of both the visual and acoustical inputs are always merged into the perceived representation of speech, and whether this audiovisual integration is based on cross-modal binding functions or on imitation. In a McGurk paradigm, observers were required to repeat aloud a string of phonemes uttered by an actor (acoustical presentation of the phonemic string) whose mouth, in contrast, mimicked the pronunciation of a different string (visual presentation). In a control experiment, participants read the same printed strings of letters; this condition served to analyze the voice pattern and the lip kinematics while controlling for imitation. In the control experiment and in the congruent audiovisual presentation, i.e., when the mouth articulation gestures were congruent with the emitted string of phonemes, the voice spectrum and the lip kinematics varied according to the pronounced string of phonemes. In the McGurk paradigm, the participants were unaware of the incongruence between the visual and acoustical stimuli. The acoustical analysis of the participants' spoken responses showed three distinct patterns: fusion of the two stimuli (the McGurk effect), repetition of the acoustically presented string of phonemes, and, less frequently, repetition of the string of phonemes corresponding to the mouth gestures mimicked by the actor. However, the analysis of the latter two response types showed that the second formant (F2) of the participants' voice spectra always differed from the value recorded in the congruent audiovisual presentation: it shifted toward the F2 value of the string of phonemes presented in the other modality, which was apparently ignored. The lip kinematics of participants repeating the acoustically presented string of phonemes were influenced by observation of the lip movements mimicked by the actor, but only when pronouncing a labial consonant.
The data are discussed in favor of the hypothesis that features of both the visual and acoustical inputs always contribute to the representation of a string of phonemes, and that cross-modal integration occurs by extracting the mouth articulation features peculiar to the pronunciation of that string of phonemes.
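The acoustic analysis above centers on the second formant (F2) of the participants' voice spectra. As an illustrative sketch only (the paper does not describe its analysis software; the function names and the linear-predictive-coding approach here are assumptions, not the authors' actual pipeline), formant frequencies can be estimated from a short voiced frame by fitting an all-pole model and reading the pole angles as resonance frequencies:

```python
import numpy as np

def lpc_coefficients(frame, order):
    """All-pole (LPC) fit via the autocorrelation method."""
    n = len(frame)
    # Autocorrelation lags r[0] .. r[order]
    r = np.correlate(frame, frame, mode="full")[n - 1 : n + order]
    # Solve the Toeplitz normal equations R a = r[1:order+1]
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1 : order + 1])
    return np.concatenate(([1.0], -a))  # prediction-error filter A(z)

def estimate_formants(frame, fs, order=8):
    """Return candidate formant frequencies in Hz, lowest first."""
    windowed = frame * np.hamming(len(frame))
    a = lpc_coefficients(windowed, order)
    roots = np.roots(a)
    roots = roots[np.imag(roots) > 0]            # one of each conjugate pair
    freqs = np.angle(roots) * fs / (2 * np.pi)   # pole angle -> frequency
    # Discard near-DC and near-Nyquist artifacts
    return sorted(f for f in freqs if 90 < f < fs / 2 - 90)
```

On a frame containing two damped resonances, the two lowest returned frequencies approximate F1 and F2; in practice the LPC order is chosen at roughly 2 poles per expected formant plus a small margin.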


Year:  2005        PMID: 16034571     DOI: 10.1007/s00221-005-0008-z

Source DB:  PubMed          Journal:  Exp Brain Res        ISSN: 0014-4819            Impact factor:   1.972


References: 26 in total

1.  Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex.

Authors:  G A Calvert; R Campbell; M J Brammer
Journal:  Curr Biol       Date:  2000-06-01       Impact factor: 10.834

2.  Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas.

Authors:  Laurie Carr; Marco Iacoboni; Marie-Charlotte Dubeau; John C Mazziotta; Gian Luigi Lenzi
Journal:  Proc Natl Acad Sci U S A       Date:  2003-04-07       Impact factor: 11.205

3. (Review) Functional MRI of language: new approaches to understanding the cortical organization of semantic processing.

Authors:  Susan Bookheimer
Journal:  Annu Rev Neurosci       Date:  2002-03-19       Impact factor: 12.449

4. (Review) Lipreading and audio-visual speech perception.

Authors:  Q Summerfield
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  1992-01-29       Impact factor: 6.237

5.  Neural circuits involved in the recognition of actions performed by nonconspecifics: an FMRI study.

Authors:  Giovanni Buccino; Fausta Lui; Nicola Canessa; Ilaria Patteri; Giovanna Lagravinese; Francesca Benuzzi; Carlo A Porro; Giacomo Rizzolatti
Journal:  J Cogn Neurosci       Date:  2004 Jan-Feb       Impact factor: 3.225

6.  Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study.

Authors:  G Buccino; F Binkofski; G R Fink; L Fadiga; L Fogassi; V Gallese; R J Seitz; K Zilles; G Rizzolatti; H J Freund
Journal:  Eur J Neurosci       Date:  2001-01       Impact factor: 3.386

7.  Mandarin speech perception by ear and eye follows a universal principle.

Authors:  Trevor H Chen; Dominic W Massaro
Journal:  Percept Psychophys       Date:  2004-07

8.  The motor theory of speech perception revised.

Authors:  A M Liberman; I G Mattingly
Journal:  Cognition       Date:  1985-10

9.  Execution and observation of bringing a fruit to the mouth affect syllable pronunciation.

Authors:  Maurizio Gentilucci; Paola Santunione; Alice C Roy; Silvia Stefanini
Journal:  Eur J Neurosci       Date:  2004-01       Impact factor: 3.386

10.  The essential role of Broca's area in imitation.

Authors:  Marc Heiser; Marco Iacoboni; Fumiko Maeda; Jake Marcus; John C Mazziotta
Journal:  Eur J Neurosci       Date:  2003-03       Impact factor: 3.386

Cited by: 17 in total

1.  Neural correlates of interindividual differences in children's audiovisual speech perception.

Authors:  Audrey R Nath; Eswen E Fava; Michael S Beauchamp
Journal:  J Neurosci       Date:  2011-09-28       Impact factor: 6.167

2.  Hearing lips and seeing voices: how cortical areas supporting speech production mediate audiovisual speech perception.

Authors:  Jeremy I Skipper; Virginie van Wassenhove; Howard C Nusbaum; Steven L Small
Journal:  Cereb Cortex       Date:  2007-01-11       Impact factor: 5.357

3.  When flavor guides motor control: an effector independence study.

Authors:  Valentina Parma; Roberto Roverato; Deborah Ghirardello; Maria Bulgheroni; Roberto Tirindelli; Umberto Castiello
Journal:  Exp Brain Res       Date:  2011-05-27       Impact factor: 1.972

4.  Silent articulation modulates auditory and audiovisual speech perception.

Authors:  Marc Sato; Emilie Troille; Lucie Ménard; Marie-Agnès Cathiard; Vincent Gracco
Journal:  Exp Brain Res       Date:  2013-04-17       Impact factor: 1.972

5.  Multisensory speech perception in children with autism spectrum disorders.

Authors:  Tiffany G Woynaroski; Leslie D Kwakye; Jennifer H Foss-Feig; Ryan A Stevenson; Wendy L Stone; Mark T Wallace
Journal:  J Autism Dev Disord       Date:  2013-12

6.  A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion.

Authors:  Audrey R Nath; Michael S Beauchamp
Journal:  Neuroimage       Date:  2011-07-20       Impact factor: 6.556

7.  The Bilingual Language Interaction Network for Comprehension of Speech.

Authors:  Anthony Shook; Viorica Marian
Journal:  Biling (Camb Engl)       Date:  2013-04-01

8.  Speech Perception as a Multimodal Phenomenon.

Authors:  Lawrence D Rosenblum
Journal:  Curr Dir Psychol Sci       Date:  2008-12

9.  Please say what this word is-Vowel-extrinsic normalization in the sensorimotor control of speech.

Authors:  Nicolas J Bourguignon; Shari R Baum; Douglas M Shiller
Journal:  J Exp Psychol Hum Percept Perform       Date:  2016-01-28       Impact factor: 3.332

10.  The allocation of attention to learning of goal-directed actions: a cognitive neuroscience framework focusing on the Basal Ganglia.

Authors:  E A Franz
Journal:  Front Psychol       Date:  2012-12-21

Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.