| Literature DB >> 31323242 |
Elina Kaplan, Alexandra Jesse.
Abstract
In face-to-face conversations, when listeners process and combine information obtained from hearing and seeing a speaker, they mostly look at the eyes rather than at the more informative mouth region. Measuring event-related potentials, we tested whether fixating the speaker's eyes is sufficient for gathering enough visual speech information to modulate early auditory processing, or whether covert attention to the speaker's mouth is needed. Results showed that when listeners fixated the eye region of the speaker, the amplitudes of the auditory evoked N1 and P2 were reduced when listeners heard and saw the speaker than when they only heard her. These cross-modal interactions also occurred when, in addition, attention was restricted to the speaker's eye region. Fixating the speaker's eyes thus provides listeners with sufficient visual information to facilitate early auditory processing. The spread of covert attention to the mouth area is not needed to observe audiovisual interactions.
Keywords: Attention; Audiovisual speech perception; Event-related potentials; Multisensory processing
Year: 2019 PMID: 31323242 PMCID: PMC6719704 DOI: 10.1016/j.biopsycho.2019.107724
Source DB: PubMed Journal: Biol Psychol ISSN: 0301-0511 Impact factor: 3.251