
Visual anticipatory information modulates multisensory interactions of artificial audiovisual stimuli.

Jean Vroomen, Jeroen J Stekelenburg.

Abstract

The neural activity of speech sound processing (the N1 component of the auditory ERP) can be suppressed if a speech sound is accompanied by concordant lip movements. Here we demonstrate that this audiovisual interaction is neither speech specific nor linked to humanlike actions but can be observed with artificial stimuli if their timing is made predictable. In Experiment 1, a pure tone synchronized with a deformation of a rectangle induced a smaller auditory N1 than auditory-only presentations if the temporal occurrence of this audiovisual event was made predictable by two moving disks that touched the rectangle. Local autoregressive average source estimation indicated that this audiovisual interaction may be related to integrative processing in auditory areas. When the moving disks did not precede the audiovisual stimulus (making the onset unpredictable), there was no N1 reduction. In Experiment 2, the predictability of the leading visual signal was manipulated by introducing a temporal asynchrony between the audiovisual event and the collision of the moving disks. Audiovisual events occurred either at the moment the disks collided with the rectangle, before it (too "early"), or after it (too "late"). When asynchronies varied from trial to trial (rendering the moving disks unreliable temporal predictors of the audiovisual event), the N1 reduction was abolished. These results demonstrate that the N1 suppression is induced by visual information that both precedes and reliably predicts audiovisual onset, without a necessary link to human action-related neural mechanisms.


Year:  2010        PMID: 19583474     DOI: 10.1162/jocn.2009.21308

Source DB:  PubMed          Journal:  J Cogn Neurosci        ISSN: 0898-929X            Impact factor:   3.225


48 in total

1.  Multistage audiovisual integration of speech: dissociating identification and detection.

Authors:  Kasper Eskelund; Jyrki Tuomainen; Tobias S Andersen
Journal:  Exp Brain Res       Date:  2010-12-25       Impact factor: 1.972

2.  Children with a history of SLI show reduced sensitivity to audiovisual temporal asynchrony: an ERP study.

Authors:  Natalya Kaganovich; Jennifer Schumaker; Laurence B Leonard; Dana Gustafson; Danielle Macias
Journal:  J Speech Lang Hear Res       Date:  2014-08       Impact factor: 2.297

Review 3.  Attention and prediction in human audition: a lesson from cognitive psychophysiology.

Authors:  Erich Schröger; Anna Marzecová; Iria SanMiguel
Journal:  Eur J Neurosci       Date:  2015-03       Impact factor: 3.386

4.  Contextual control of audiovisual integration in low-level sensory cortices.

Authors:  Nienke M van Atteveldt; Bradley S Peterson; Charles E Schroeder
Journal:  Hum Brain Mapp       Date:  2013-08-24       Impact factor: 5.038

Review 5.  Multisensory integration: flexible use of general operations.

Authors:  Nienke van Atteveldt; Micah M Murray; Gregor Thut; Charles E Schroeder
Journal:  Neuron       Date:  2014-03-19       Impact factor: 17.173

Review 6.  Audiotactile interactions in temporal perception.

Authors:  Valeria Occelli; Charles Spence; Massimiliano Zampini
Journal:  Psychon Bull Rev       Date:  2011-06

7.  Emotion and goal-directed behavior: ERP evidence on cognitive and emotional conflict.

Authors:  Artyom Zinchenko; Philipp Kanske; Christian Obermeier; Erich Schröger; Sonja A Kotz
Journal:  Soc Cogn Affect Neurosci       Date:  2015-04-28       Impact factor: 3.436

8.  Inverse effectiveness and multisensory interactions in visual event-related potentials with audiovisual speech.

Authors:  Ryan A Stevenson; Maxim Bushmakin; Sunah Kim; Mark T Wallace; Aina Puce; Thomas W James
Journal:  Brain Topogr       Date:  2012-02-25       Impact factor: 3.020

9.  The role of emotion in dynamic audiovisual integration of faces and voices.

Authors:  Jenny Kokinous; Sonja A Kotz; Alessandro Tavano; Erich Schröger
Journal:  Soc Cogn Affect Neurosci       Date:  2014-08-20       Impact factor: 3.436

10.  Cross-modal prediction in speech depends on prior linguistic experience.

Authors:  Carolina Sánchez-García; James T Enns; Salvador Soto-Faraco
Journal:  Exp Brain Res       Date:  2013-02-06       Impact factor: 1.972

