
Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing.

Elina Kaplan, Alexandra Jesse

Abstract

In face-to-face conversations, when listeners process and combine information obtained from hearing and seeing a speaker, they mostly look at the eyes rather than at the more informative mouth region. Measuring event-related potentials, we tested whether fixating the speaker's eyes is sufficient for gathering enough visual speech information to modulate early auditory processing, or whether covert attention to the speaker's mouth is needed. Results showed that when listeners fixated the eye region of the speaker, the amplitudes of the auditory evoked N1 and P2 were reduced when listeners heard and saw the speaker compared to when they only heard her. These cross-modal interactions also occurred when, in addition, attention was restricted to the speaker's eye region. Fixating the speaker's eyes thus provides listeners with sufficient visual information to facilitate early auditory processing. The spread of covert attention to the mouth area is not needed to observe audiovisual interactions.
Copyright © 2019 Elsevier B.V. All rights reserved.

Keywords:  Attention; Audiovisual speech perception; Event-related potentials; Multisensory processing

Year: 2019    PMID: 31323242    PMCID: PMC6719704    DOI: 10.1016/j.biopsycho.2019.107724

Source DB: PubMed    Journal: Biol Psychol    ISSN: 0301-0511    Impact factor: 3.251


References (47 in total)

1.  Covert and overt voluntary attention: linked or independent?

Authors:  Amelia R Hunt; Alan Kingstone
Journal:  Brain Res Cogn Brain Res       Date:  2003-12

2.  The influence of conceptual knowledge on visual discrimination.

Authors:  Isabel Gauthier; Thomas W James; Kim M Curby; Michael J Tarr
Journal:  Cogn Neuropsychol       Date:  2003-05-01       Impact factor: 2.468

3.  Audio-visual speech perception off the top of the head.

Authors:  Chris Davis; Jeesun Kim
Journal:  Cognition       Date:  2005-11-08

4.  Neural correlates of multisensory integration of ecologically valid audiovisual events.

Authors:  Jeroen J Stekelenburg; Jean Vroomen
Journal:  J Cogn Neurosci       Date:  2007-12       Impact factor: 3.225

5.  Electrophysiological evidence for speech-specific audiovisual integration.

Authors:  Martijn Baart; Jeroen J Stekelenburg; Jean Vroomen
Journal:  Neuropsychologia       Date:  2013-11-27       Impact factor: 3.139

6.  Using EEG and stimulus context to probe the modelling of auditory-visual speech.

Authors:  Tim Paris; Jeesun Kim; Chris Davis
Journal:  Cortex       Date:  2015-04-17       Impact factor: 4.027

7.  Allocation of attention in the visual field.

Authors:  C W Eriksen; Y Y Yeh
Journal:  J Exp Psychol Hum Percept Perform       Date:  1985-10       Impact factor: 3.332

8.  Quantifying the contribution of vision to speech perception in noise.

Authors:  A MacLeod; Q Summerfield
Journal:  Br J Audiol       Date:  1987-05

9.  "Hey John": signals conveying communicative intention toward the self activate brain regions associated with "mentalizing," regardless of modality.

Authors:  Knut K W Kampe; Chris D Frith; Uta Frith
Journal:  J Neurosci       Date:  2003-06-15       Impact factor: 6.167

10.  ERPLAB: an open-source toolbox for the analysis of event-related potentials.

Authors:  Javier Lopez-Calderon; Steven J Luck
Journal:  Front Hum Neurosci       Date:  2014-04-14       Impact factor: 3.169

Cited by (1 in total)

1.  Lip-Reading Enables the Brain to Synthesize Auditory Features of Unknown Silent Speech.

Authors:  Mathieu Bourguignon; Martijn Baart; Efthymia C Kapnoula; Nicola Molinaro
Journal:  J Neurosci       Date:  2019-12-30       Impact factor: 6.167

