
Assessing automaticity in audiovisual speech integration: evidence from the speeded classification task.

Salvador Soto-Faraco, Jordi Navarra, Agnès Alsius.

Abstract

The McGurk effect is usually presented as an example of fast, automatic, multisensory integration. We report a series of experiments designed to directly assess these claims. We used a syllabic version of the speeded classification paradigm, whereby response latencies to the first (target) syllable of spoken word-like stimuli are slowed down when the second (irrelevant) syllable varies from trial to trial. This interference effect is interpreted as a failure of selective attention to filter out the irrelevant syllable. In Experiment 1 we reproduced the syllabic interference effect with bimodal stimuli containing auditory as well as visual lip movement information, thus confirming the generalizability of the phenomenon. In subsequent experiments we were able to produce (Experiment 2) and to eliminate (Experiment 3) syllabic interference by introducing 'illusory' (McGurk) audiovisual stimuli in the irrelevant syllable, suggesting that audiovisual integration occurs prior to attentional selection in this paradigm. Copyright 2004 Elsevier B.V.


Year:  2004        PMID: 15019556     DOI: 10.1016/j.cognition.2003.10.005

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related references: 30 in total

1.  A common perceptual temporal limit of binding synchronous inputs across different sensory attributes and modalities.

Authors:  Waka Fujisaki; Shin'ya Nishida
Journal:  Proc Biol Sci       Date:  2010-03-24       Impact factor: 5.349

2.  Hearing lips in a second language: visual articulatory information enables the perception of second language sounds.

Authors:  Jordi Navarra; Salvador Soto-Faraco
Journal:  Psychol Res       Date:  2005-12-14

3.  Is the auditory sensory memory sensitive to visual information?

Authors:  Julien Besle; Alexandra Fort; Marie-Hélène Giard
Journal:  Exp Brain Res       Date:  2005-07-23       Impact factor: 1.972

4.  Attention to touch weakens audiovisual speech integration.

Authors:  Agnès Alsius; Jordi Navarra; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2007-11       Impact factor: 1.972

5.  [Review] The processing of audio-visual speech: empirical and neural bases.

Authors:  Ruth Campbell
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2008-03-12       Impact factor: 6.237

6.  Searching for audiovisual correspondence in multiple speaker scenarios.

Authors:  Agnès Alsius; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2011-03-23       Impact factor: 1.972

7.  Audiovisual integration as conflict resolution: The conflict of the McGurk illusion.

Authors:  Luis Morís Fernández; Emiliano Macaluso; Salvador Soto-Faraco
Journal:  Hum Brain Mapp       Date:  2017-08-09       Impact factor: 5.038

8.  Brief report: Arrested development of audiovisual speech perception in autism spectrum disorders.

Authors:  Ryan A Stevenson; Justin K Siemann; Tiffany G Woynaroski; Brittany C Schneider; Haley E Eberly; Stephen M Camarata; Mark T Wallace
Journal:  J Autism Dev Disord       Date:  2014-06

9.  Audio-visual speech cue combination.

Authors:  Derek H Arnold; Morgan Tear; Ryan Schindel; Warrick Roseboom
Journal:  PLoS One       Date:  2010-04-16       Impact factor: 3.240

10.  Phonetic recalibration does not depend on working memory.

Authors:  Martijn Baart; Jean Vroomen
Journal:  Exp Brain Res       Date:  2010-05-01       Impact factor: 1.972

