
Learned face-voice pairings facilitate visual search.

L. Jacob Zweig, Satoru Suzuki, Marcia Grabowecky.

Abstract

Voices provide a rich source of information that is important for identifying individuals and for social interaction. During search for a face in a crowd, voices often accompany visual information, and they facilitate localization of the sought-after individual. However, it is unclear whether this facilitation occurs primarily because the voice cues the location of the face or because it also increases the salience of the associated face. Here we demonstrate that a voice that provides no location information nonetheless facilitates visual search for an associated face. We trained novel face-voice associations and verified learning using a two-alternative forced choice task in which participants had to correctly match a presented voice to the associated face. Following training, participants searched for a previously learned target face among other faces while hearing one of the following sounds (localized at the center of the display): a congruent learned voice, an incongruent but familiar voice, an unlearned and unfamiliar voice, or a time-reversed voice. Only the congruent learned voice speeded visual search for the associated face. This result suggests that voices facilitate the visual detection of associated faces, potentially by increasing their visual salience, and that the underlying crossmodal associations can be established through brief training.


Year: 2015        PMID: 25023955        PMCID: PMC4295001        DOI: 10.3758/s13423-014-0685-3

Source DB: PubMed        Journal: Psychon Bull Rev        ISSN: 1069-9384


References: 48 in total

1.  The timing and laminar profile of converging inputs to multisensory areas of the macaque neocortex.

Authors:  Charles E Schroeder; John J Foxe
Journal:  Brain Res Cogn Brain Res       Date:  2002-06

2.  Object familiarity and semantic congruency modulate responses in cortical audiovisual integration areas.

Authors:  Grit Hein; Oliver Doehrmann; Notger G Müller; Jochen Kaiser; Lars Muckli; Marcus J Naumer
Journal:  J Neurosci       Date:  2007-07-25       Impact factor: 6.167

3.  Interaction of face and voice areas during speaker recognition.

Authors:  Katharina von Kriegstein; Andreas Kleinschmidt; Philipp Sterzer; Anne-Lise Giraud
Journal:  J Cogn Neurosci       Date:  2005-03       Impact factor: 3.225

4.  Auditory–visual interaction in single cells in the cortex of the superior temporal sulcus and the orbital frontal cortex of the macaque monkey.

Authors:  L A Benevento; J Fallon; B J Davis; M Rezak
Journal:  Exp Neurol       Date:  1977-12       Impact factor: 5.330

5.  Influences of familiarity on the processing of faces.

Authors:  V Bruce
Journal:  Perception       Date:  1986       Impact factor: 1.490

6.  Direct structural connections between voice- and face-recognition areas.

Authors:  Helen Blank; Alfred Anwander; Katharina von Kriegstein
Journal:  J Neurosci       Date:  2011-09-07       Impact factor: 6.167

7.  Characteristic sounds make you look at target objects more quickly.

Authors:  Lucica Iordanescu; Marcia Grabowecky; Steven Franconeri; Jan Theeuwes; Satoru Suzuki
Journal:  Atten Percept Psychophys       Date:  2010-10       Impact factor: 2.199

8.  Sound alters activity in human V1 in association with illusory visual perception.

Authors:  S Watkins; L Shams; S Tanaka; J-D Haynes; G Rees
Journal:  Neuroimage       Date:  2006-03-23       Impact factor: 6.556

9.  Spatial attention modulates initial afferent activity in human primary visual cortex.

Authors:  Simon P Kelly; Manuel Gomez-Ramirez; John J Foxe
Journal:  Cereb Cortex       Date:  2008-03-04       Impact factor: 5.357

10.  Looking for myself: current multisensory input alters self-face recognition.

Authors:  Manos Tsakiris
Journal:  PLoS One       Date:  2008-12-24       Impact factor: 3.240

