Eye movement of perceivers during audiovisual speech perception.

E Vatikiotis-Bateson, I M Eigsti, S Yano, K G Munhall.

Abstract

Perceiver eye movements were recorded during audiovisual presentations of extended monologues. Monologues were presented at different image sizes and with different levels of acoustic masking noise. Two clear targets of gaze fixation were identified, the eyes and the mouth. Regardless of image size, perceivers of both Japanese and English gazed more at the mouth as masking noise levels increased. However, even at the highest noise levels and largest image sizes, subjects gazed at the mouth only about half the time. For the eye target, perceivers typically gazed at one eye more than the other, and the tendency became stronger at higher noise levels. English perceivers displayed more variety of gaze-sequence patterns (e.g., left eye to mouth to left eye to right eye) and persisted in using them at higher noise levels than did Japanese perceivers. No segment-level correlations were found between perceiver eye motions and phoneme identity of the stimuli.

Year:  1998        PMID: 9718953     DOI: 10.3758/bf03211929

Source DB:  PubMed          Journal:  Percept Psychophys        ISSN: 0031-5117


Related articles:  46 in total

1.  Infants deploy selective attention to the mouth of a talking face when learning speech.

Authors:  David J Lewkowicz; Amy M Hansen-Tift
Journal:  Proc Natl Acad Sci U S A       Date:  2012-01-17       Impact factor: 11.205

2.  Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus.

Authors:  Chandramouli Chandrasekaran; Asif A Ghazanfar
Journal:  J Neurophysiol       Date:  2008-11-26       Impact factor: 2.714

3.  The effect of varying talker identity and listening conditions on gaze behavior during audiovisual speech perception.

Authors:  Julie N Buchan; Martin Paré; Kevin G Munhall
Journal:  Brain Res       Date:  2008-06-28       Impact factor: 3.252

4.  Acoustic noise and vision differentially warp the auditory categorization of speech.

Authors:  Gavin M Bidelman; Lauren Sigley; Gwyneth A Lewis
Journal:  J Acoust Soc Am       Date:  2019-07       Impact factor: 1.840

5.  Free viewing of talking faces reveals mouth and eye preferring regions of the human superior temporal sulcus.

Authors:  Johannes Rennig; Michael S Beauchamp
Journal:  Neuroimage       Date:  2018-08-06       Impact factor: 6.556

6.  Spatial Frequency Requirements and Gaze Strategy in Visual-Only and Audiovisual Speech Perception.

Authors:  Amanda H Wilson; Agnès Alsius; Martin Paré; Kevin G Munhall
Journal:  J Speech Lang Hear Res       Date:  2016-08-01       Impact factor: 2.297

7.  Musicians have enhanced audiovisual multisensory binding: experience-dependent effects in the double-flash illusion.

Authors:  Gavin M Bidelman
Journal:  Exp Brain Res       Date:  2016-06-22       Impact factor: 1.972

8.  "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism.

Authors:  Ruth B Grossman; Erin Steinhart; Teresa Mitchell; William McIlvane
Journal:  Autism Res       Date:  2015-01-24       Impact factor: 5.216

9.  Bilingualism modulates infants' selective attention to the mouth of a talking face.

Authors:  Ferran Pons; Laura Bosch; David J Lewkowicz
Journal:  Psychol Sci       Date:  2015-03-12

10.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

