
Auditory-visual crossmodal integration in perception of face gender.

Eric L Smith, Marcia Grabowecky, Satoru Suzuki.

Abstract

Whereas extensive neuroscientific and behavioral evidence has confirmed a role of auditory-visual integration in representing space [1-6], little is known about the role of auditory-visual integration in object perception. Although recent neuroimaging results suggest integrated auditory-visual object representations [7-11], substantiating behavioral evidence has been lacking. We demonstrated auditory-visual integration in the perception of face gender by using pure tones that are processed in low-level auditory brain areas and that lack the spectral components that characterize human vocalization. When androgynous faces were presented together with pure tones in the male fundamental-speaking-frequency range, faces were more likely to be judged as male, whereas when faces were presented with pure tones in the female fundamental-speaking-frequency range, they were more likely to be judged as female. Importantly, when participants were explicitly asked to attribute gender to these pure tones, their judgments were primarily based on relative pitch and were uncorrelated with the male and female fundamental-speaking-frequency ranges. This perceptual dissociation of absolute-frequency-based crossmodal-integration effects from relative-pitch-based explicit perception of the tones provides evidence for a sensory integration of auditory and visual signals in representing human gender. This integration probably develops because of concurrent neural processing of visual and auditory features of gender.

Year: 2007        PMID: 17825561        DOI: 10.1016/j.cub.2007.08.043

Source DB: PubMed        Journal: Curr Biol        ISSN: 0960-9822        Impact factor: 10.834


Related articles: 17 in total

1.  Integration of auditory and visual information in the recognition of realistic objects.

Authors:  Clara Suied; Nicolas Bonneel; Isabelle Viaud-Delmon
Journal:  Exp Brain Res       Date:  2008-12-18       Impact factor: 1.972

2.  Characteristic sounds facilitate visual search.

Authors:  Lucica Iordanescu; Emmanuel Guzman-Martinez; Marcia Grabowecky; Satoru Suzuki
Journal:  Psychon Bull Rev       Date:  2008-06

3.  Self-awareness affects vision.

Authors:  Eric L Smith; Marcia Grabowecky; Satoru Suzuki
Journal:  Curr Biol       Date:  2008-05-20       Impact factor: 10.834

4.  Learned face-voice pairings facilitate visual search.

Authors:  L Jacob Zweig; Satoru Suzuki; Marcia Grabowecky
Journal:  Psychon Bull Rev       Date:  2015-04

5.  Why we are not all synesthetes (not even weakly so). [Review]

Authors:  Ophelia Deroy; Charles Spence
Journal:  Psychon Bull Rev       Date:  2013-08

6.  Are crossmodal correspondences relative or absolute? Sequential effects on speeded classification.

Authors:  Riccardo Brunetti; Allegra Indraccolo; Claudia Del Gatto; Charles Spence; Valerio Santangelo
Journal:  Atten Percept Psychophys       Date:  2018-02       Impact factor: 2.199

7.  Object-based auditory facilitation of visual search for pictures and words with frequent and rare targets.

Authors:  Lucica Iordanescu; Marcia Grabowecky; Satoru Suzuki
Journal:  Acta Psychol (Amst)       Date:  2010-09-22

8.  Sounds exaggerate visual shape.

Authors:  Timothy D Sweeny; Emmanuel Guzman-Martinez; Laura Ortega; Marcia Grabowecky; Satoru Suzuki
Journal:  Cognition       Date:  2012-05-25

9.  Audition dominates vision in duration perception irrespective of salience, attention, and temporal discriminability.

Authors:  Laura Ortega; Emmanuel Guzman-Martinez; Marcia Grabowecky; Satoru Suzuki
Journal:  Atten Percept Psychophys       Date:  2014-07       Impact factor: 2.199

10.  Top-down and bottom-up modulation in processing bimodal face/voice stimuli.

Authors:  Marianne Latinus; Rufin VanRullen; Margot J Taylor
Journal:  BMC Neurosci       Date:  2010-03-11       Impact factor: 3.288

