
Using EEG and stimulus context to probe the modelling of auditory-visual speech.

Tim Paris, Jeesun Kim, Chris Davis.

Abstract

We investigated whether internal models of the relationship between lip movements and corresponding speech sounds [Auditory-Visual (AV) speech] could be updated through experience. AV associations were indexed by early and late event-related potentials (ERPs) and by oscillatory power and phase locking. Different AV experience was produced via a context manipulation: participants were presented with valid (the conventional pairing) and invalid AV speech items in either a 'reliable' context (80% AVvalid items) or an 'unreliable' context (80% AVinvalid items). The results showed that in the reliable context there was N1 facilitation for AV compared with auditory-only speech; this N1 facilitation was not affected by AV validity. Later ERPs differed in amplitude between valid and invalid AV speech, and power was significantly enhanced for valid versus invalid AV speech. These response patterns did not change over the context manipulation, suggesting that the internal models of AV speech were not updated by experience. The results also showed that N1 facilitation did not vary as a function of the salience of visual speech (as previously reported); post-hoc analyses instead indicated that N1 facilitation varied with the relative timing of the acoustic onset, suggesting that for AV events the N1 may be more sensitive to AV timing relationships than to form.
Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
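
As an aside for readers unfamiliar with the oscillatory measures named in the abstract, the following is a minimal sketch of how trial-averaged power and inter-trial phase locking are commonly computed from epoched EEG. It runs on simulated data; the array shapes, the theta band limits, and the crude FFT-based band-pass are illustrative assumptions, not the authors' analysis pipeline.

import numpy as np
from scipy.signal import hilbert

# Simulated epochs: (n_trials, n_samples) at fs Hz; real data would come from
# EEG segments time-locked to the auditory onset.
rng = np.random.default_rng(0)
fs = 250
n_trials, n_samples = 60, 500
epochs = rng.standard_normal((n_trials, n_samples))

def bandpass(x, lo, hi, fs):
    # Crude zero-phase band-pass: zero out FFT bins outside [lo, hi] Hz.
    spec = np.fft.rfft(x, axis=-1)
    freqs = np.fft.rfftfreq(x.shape[-1], d=1.0 / fs)
    spec[..., (freqs < lo) | (freqs > hi)] = 0
    return np.fft.irfft(spec, n=x.shape[-1], axis=-1)

theta = bandpass(epochs, 4.0, 8.0, fs)          # band-limit (e.g. theta, 4-8 Hz)
analytic = hilbert(theta, axis=-1)              # complex analytic signal per trial
power = np.mean(np.abs(analytic) ** 2, axis=0)  # trial-averaged oscillatory power
itpc = np.abs(np.mean(analytic / np.abs(analytic), axis=0))  # phase locking, 0 (none) to 1 (perfect)

Larger ITPC values at a given latency indicate that the phase of the band-limited response is consistent across trials, which is the sense in which 'phase locking' is used above.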

Keywords:  Audiovisual speech; Context; ERP; Internal model; Oscillations

Year:  2015        PMID: 26045213     DOI: 10.1016/j.cortex.2015.03.010

Source DB:  PubMed          Journal:  Cortex        ISSN: 0010-9452            Impact factor:   4.027


  4 in total

1.  Fixating the eyes of a speaker provides sufficient visual information to modulate early auditory processing.

Authors:  Elina Kaplan; Alexandra Jesse
Journal:  Biol Psychol       Date:  2019-07-16       Impact factor: 3.251

2.  Musical Expertise Affects Audiovisual Speech Perception: Findings From Event-Related Potentials and Inter-trial Phase Coherence.

Authors:  Marzieh Sorati; Dawn Marie Behne
Journal:  Front Psychol       Date:  2019-11-15

3.  Considerations in Audio-Visual Interaction Models: An ERP Study of Music Perception by Musicians and Non-musicians.

Authors:  Marzieh Sorati; Dawn M Behne
Journal:  Front Psychol       Date:  2021-01-20

4.  Audiovisual Modulation in Music Perception for Musicians and Non-musicians.

Authors:  Marzieh Sorati; Dawn Marie Behne
Journal:  Front Psychol       Date:  2020-05-29
