
Eye movements of monkey observers viewing vocalizing conspecifics.

Asif A Ghazanfar, Kristina Nielsen, Nikos K Logothetis.

Abstract

Primates, including humans, communicate using facial expressions, vocalizations, and often a combination of the two modalities. For humans, such bimodal integration is best exemplified by speech-reading: humans readily use facial cues to enhance speech comprehension, particularly in noisy environments. Studies of the eye movement patterns of human speech-readers have revealed, unexpectedly, that they predominantly fixate on the eye region of the face as opposed to the mouth. Here, we tested the evolutionary basis for such a behavioral strategy by examining the eye movements of rhesus monkey observers as they viewed vocalizing conspecifics. Under a variety of listening conditions, we found that rhesus monkeys predominantly focused on the eye region versus the mouth, and that fixations on the mouth were tightly correlated with the onset of mouth movements. These eye movement patterns of rhesus monkeys are strikingly similar to those reported for humans observing the visual components of speech. The data therefore suggest that the sensorimotor strategies underlying bimodal speech perception may have a homologous counterpart in a closely related primate ancestor.

Year:  2006        PMID: 16448641     DOI: 10.1016/j.cognition.2005.12.007

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles: 27 in total

1.  Visuoauditory mappings between high luminance and high pitch are shared by chimpanzees (Pan troglodytes) and humans.

Authors:  Vera U Ludwig; Ikuma Adachi; Tetsuro Matsuzawa
Journal:  Proc Natl Acad Sci U S A       Date:  2011-12-05       Impact factor: 11.205

2.  A parameterized digital 3D model of the Rhesus macaque face for investigating the visual processing of social cues.

Authors:  Aidan P Murphy; David A Leopold
Journal:  J Neurosci Methods       Date:  2019-06-20       Impact factor: 2.390

3.  Different neural frequency bands integrate faces and voices differently in the superior temporal sulcus.

Authors:  Chandramouli Chandrasekaran; Asif A Ghazanfar
Journal:  J Neurophysiol       Date:  2008-11-26       Impact factor: 2.714

4.  Dynamic faces speed up the onset of auditory cortical spiking responses during vocal detection.

Authors:  Chandramouli Chandrasekaran; Luis Lemus; Asif A Ghazanfar
Journal:  Proc Natl Acad Sci U S A       Date:  2013-11-11       Impact factor: 11.205

5.  Interactions between the superior temporal sulcus and auditory cortex mediate dynamic face/voice integration in rhesus monkeys.

Authors:  Asif A Ghazanfar; Chandramouli Chandrasekaran; Nikos K Logothetis
Journal:  J Neurosci       Date:  2008-04-23       Impact factor: 6.167

6.  Videos of conspecifics elicit interactive looking patterns and facial expressions in monkeys.

Authors:  Clayton P Mosher; Prisca E Zimmerman; Katalin M Gothard
Journal:  Behav Neurosci       Date:  2011-08       Impact factor: 1.912

Review 7.  The primate amygdala in social perception - insights from electrophysiological recordings and stimulation.

Authors:  Ueli Rutishauser; Adam N Mamelak; Ralph Adolphs
Journal:  Trends Neurosci       Date:  2015-04-03       Impact factor: 13.837

8.  Neurons responsive to face-view in the primate ventrolateral prefrontal cortex.

Authors:  L M Romanski; M M Diehl
Journal:  Neuroscience       Date:  2011-05-13       Impact factor: 3.590

9.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

Review 10.  The multisensory roles for auditory cortex in primate vocal communication.

Authors:  Asif A Ghazanfar
Journal:  Hear Res       Date:  2009-04-14       Impact factor: 3.208

