Literature DB >> 28601721

A universal bias in adult vowel perception - By ear or by eye.

Matthew Masapollo, Linda Polka, Lucie Ménard.

Abstract

Speech perceivers are universally biased toward "focal" vowels (i.e., vowels whose adjacent formants are close in frequency, which concentrates acoustic energy into a narrower spectral region). This bias manifests in phonetic discrimination tasks as a directional asymmetry: a change from a relatively less to a relatively more focal vowel yields significantly better performance than a change in the reverse direction. We investigated whether the critical information for this directional effect is limited to the auditory modality, or whether visible articulatory information provided by the speaker's face also plays a role. Unimodal auditory, unimodal visual, and bimodal (auditory-visual) vowel stimuli were created from video recordings of a speaker producing variants of /u/ that differed in both their degree of focalization and visible lip rounding (i.e., lip compression and protrusion). In Experiment 1, we confirmed that subjects showed an asymmetry while discriminating the auditory vowel stimuli. In Experiment 2, we found a similar asymmetry when subjects lip-read those same vowels. In Experiment 3, we found asymmetries, comparable to those found for unimodal vowels, for bimodal vowels when the audio and visual channels were phonetically congruent. In contrast, when the audio and visual channels were phonetically incongruent (as in the "McGurk effect"), this asymmetry was disrupted. These findings collectively suggest that the perceptual processes underlying the "focal" vowel bias are sensitive to articulatory information available across sensory modalities, and they raise foundational questions concerning the extent to which vowel perception derives from general-auditory or speech-gesture-specific processes.
Copyright © 2017 Elsevier B.V. All rights reserved.

Keywords:  Multisensory; Natural referent vowel framework; Prototypes; Speech perception; Universals; Vowels

Year:  2017        PMID: 28601721     DOI: 10.1016/j.cognition.2017.06.001

Source DB:  PubMed          Journal:  Cognition        ISSN: 0010-0277


Related articles:  4 in total

1.  Effects of formant proximity and stimulus prototypicality on the neural discrimination of vowels: Evidence from the auditory frequency-following response.

Authors:  T Christina Zhao; Matthew Masapollo; Linda Polka; Lucie Ménard; Patricia K Kuhl
Journal:  Brain Lang       Date:  2019-05-23       Impact factor: 2.381

2.  Asymmetries in unimodal visual vowel perception: The roles of oral-facial kinematics, orientation, and configuration.

Authors:  Matthew Masapollo; Linda Polka; Lucie Ménard; Lauren Franklin; Mark Tiede; James Morgan
Journal:  J Exp Psychol Hum Percept Perform       Date:  2018-03-08       Impact factor: 3.332

3.  Editorial: Phonological Representations and Mismatch Negativity Asymmetries.

Authors:  Arild Hestvik; Mathias Scharinger; Valerie L Shafer; Aditi Lahiri
Journal:  Front Hum Neurosci       Date:  2022-06-01       Impact factor: 3.473

4.  Neurophysiological Correlates of Asymmetries in Vowel Perception: An English-French Cross-Linguistic Event-Related Potential Study.

Authors:  Linda Polka; Monika Molnar; T Christina Zhao; Matthew Masapollo
Journal:  Front Hum Neurosci       Date:  2021-06-03       Impact factor: 3.473

