
Spatial statistics of gaze fixations during dynamic face processing.

Julie N Buchan, Martin Paré, Kevin G Munhall.

Abstract

Social interaction involves the active visual perception of facial expressions and communicative gestures. This study examines the distribution of gaze fixations while watching videos of expressive talking faces. The knowledge-driven factors that influence the selective visual processing of facial information were examined by using the same set of stimuli and assigning subjects to either a speech recognition task or an emotion judgment task. For half of the subjects assigned to each task, the intelligibility of the speech was manipulated by the addition of moderate masking noise. Both the task and the intelligibility of the speech signal influenced the spatial distribution of gaze. Gaze was concentrated more on the eyes when emotion was being judged than when words were being identified. When noise was added to the acoustic signal, gaze in both tasks was more centralized on the face. These findings show that subjects' gaze is sensitive to the distribution of information on the face, but can also be influenced by strategies aimed at maximizing the amount of visual information processed.


Year:  2007        PMID: 18633803     DOI: 10.1080/17470910601043644

Source DB:  PubMed          Journal:  Soc Neurosci        ISSN: 1747-0919            Impact factor:   2.083


Related articles:  32 in total

1.  Audiovisual speech perception and eye gaze behavior of adults with Asperger syndrome.

Authors:  Satu Saalasti; Jari Kätsyri; Kaisa Tiippana; Mari Laine-Hernandez; Lennart von Wendt; Mikko Sams
Journal:  J Autism Dev Disord       Date:  2012-08

2.  The effect of varying talker identity and listening conditions on gaze behavior during audiovisual speech perception.

Authors:  Julie N Buchan; Martin Paré; Kevin G Munhall
Journal:  Brain Res       Date:  2008-06-28       Impact factor: 3.252

3.  A link between individual differences in multisensory speech perception and eye movements.

Authors:  Demet Gurler; Nathan Doyle; Edgar Walker; John Magnotti; Michael Beauchamp
Journal:  Atten Percept Psychophys       Date:  2015-05       Impact factor: 2.199

4.  Dog owners show experience-based viewing behaviour in judging dog face approachability.

Authors:  Carla Jade Gavin; Sarah Houghton; Kun Guo
Journal:  Psychol Res       Date:  2015-10-20

5.  Spatial Frequency Requirements and Gaze Strategy in Visual-Only and Audiovisual Speech Perception.

Authors:  Amanda H Wilson; Agnès Alsius; Martin Paré; Kevin G Munhall
Journal:  J Speech Lang Hear Res       Date:  2016-08-01       Impact factor: 2.297

6.  "Look who's talking!" Gaze Patterns for Implicit and Explicit Audio-Visual Speech Synchrony Detection in Children With High-Functioning Autism.

Authors:  Ruth B Grossman; Erin Steinhart; Teresa Mitchell; William McIlvane
Journal:  Autism Res       Date:  2015-01-24       Impact factor: 5.216

7.  Looking Behavior and Audiovisual Speech Understanding in Children With Normal Hearing and Children With Mild Bilateral or Unilateral Hearing Loss.

Authors:  Dawna E Lewis; Nicholas A Smith; Jody L Spalding; Daniel L Valente
Journal:  Ear Hear       Date:  2018 Jul/Aug       Impact factor: 3.570

8.  Psychobiological Responses Reveal Audiovisual Noise Differentially Challenges Speech Recognition.

Authors:  Gavin M Bidelman; Bonnie Brown; Kelsey Mankel; Caitlin Nelms Price
Journal:  Ear Hear       Date:  2020 Mar/Apr       Impact factor: 3.570

9.  Loss of Central Vision and Audiovisual Speech Perception.

Authors:  Amanda Wilson; Adam Wilson; Martin W Ten Hove; Martin Paré; Kevin G Munhall
Journal:  Vis Impair Res       Date:  2008

10.  Looking just below the eyes is optimal across face recognition tasks.

Authors:  Matthew F Peterson; Miguel P Eckstein
Journal:  Proc Natl Acad Sci U S A       Date:  2012-11-12       Impact factor: 11.205

