
Estimating vowel formant discrimination thresholds using a single-interval classification task.

Eric Oglesbee, Diane Kewley-Port.

Abstract

Previous research estimating vowel formant discrimination thresholds in words and sentences has often employed a modified two-alternative forced-choice (2AFC) task with adaptive tracking. Although this approach has produced stable data, the length and number of experimental sessions, as well as the unnaturalness of the task, limit generalization of results to ordinary speech communication. In this exploratory study, a typical identification task was used to estimate vowel formant discrimination thresholds. Specifically, a signal detection theory approach was used to develop a method for estimating vowel formant discrimination thresholds from a quicker, more natural single-interval classification task. In experiment 1, "classification thresholds" for words in isolation and embedded in sentences were compared to previously collected 2AFC data. Experiment 2 used a within-subjects design to compare thresholds estimated from both the classification and 2AFC tasks. Because of instabilities observed in the experiment 1 sentence data, experiment 2 examined only isolated words. Results from these experiments show that, for isolated words, thresholds estimated using the classification procedure are comparable to those estimated using the 2AFC task. These results, together with an analysis of several aspects of the classification procedure, support the viability of this new approach for estimating discrimination thresholds for speech stimuli.
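The abstract does not spell out the estimation procedure, but the signal detection theory framework it invokes conventionally rests on d' computed from hit and false-alarm rates in a single-interval (yes/no) classification task. The Python sketch below is illustrative only, under that standard assumption; the function name, the correction for extreme rates, and the d' = 1 threshold criterion are assumptions for illustration, not details taken from the paper.

```python
# Illustrative sketch, not the authors' procedure: textbook yes/no d' from a
# single-interval classification task.
from scipy.stats import norm

def dprime_yes_no(hits, misses, false_alarms, correct_rejections):
    """d' = z(hit rate) - z(false-alarm rate) for a single-interval task."""
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    # Log-linear correction keeps the z-scores finite when a rate is 0 or 1.
    hit_rate = (hits + 0.5) / (n_signal + 1)
    fa_rate = (false_alarms + 0.5) / (n_noise + 1)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical example: a listener labels 100 standard and 100 formant-shifted
# tokens of the same word; the shifted tokens differ by some delta-F in Hz.
d = dprime_yes_no(hits=78, misses=22, false_alarms=30, correct_rejections=70)
print(f"d' = {d:.2f}")

# A "classification threshold" could then be defined as the formant shift at
# which d' reaches a criterion value (e.g., d' = 1), found by measuring d' at
# several shift sizes and interpolating.
```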


Year: 2009    PMID: 19354407    PMCID: PMC2736738    DOI: 10.1121/1.3086269

Source DB: PubMed    Journal: J Acoust Soc Am    ISSN: 0001-4966    Impact factor: 1.840


References: 17 in total (10 shown below)

1.  Vowel formant discrimination II: Effects of stimulus uncertainty, consonantal context, and training.

Authors:  D Kewley-Port
Journal:  J Acoust Soc Am       Date:  2001-10       Impact factor: 1.840

2.  Neighboring spectral content influences vowel identification.

Authors:  L L Holt; A J Lotto; K R Kluender
Journal:  J Acoust Soc Am       Date:  2000-08       Impact factor: 1.840

3.  Categorical perception depends on the discrimination task.

Authors:  E Gerrits; M E H Schouten
Journal:  Percept Psychophys       Date:  2004-04

4.  Categorization and discrimination of nonspeech sounds: differences between steady-state and rapidly-changing acoustic cues.

Authors:  Daniel Mirman; Lori L Holt; James L McClelland
Journal:  J Acoust Soc Am       Date:  2004-08       Impact factor: 1.840

5.  Vowel perception by noise masked normal-hearing young adults.

Authors:  Carolyn Richie; Diane Kewley-Port; Maureen Coughlin
Journal:  J Acoust Soc Am       Date:  2005-08       Impact factor: 1.840

6.  Auditory short-term memory and vowel perception.

Authors:  D B Pisoni
Journal:  Mem Cognit       Date:  1975-01

7.  Auditory and phonetic memory codes in the discrimination of consonants and vowels.

Authors:  David B Pisoni
Journal:  Percept Psychophys       Date:  1973-06

8.  Formant-frequency discrimination for isolated English vowels.

Authors:  D Kewley-Port; C S Watson
Journal:  J Acoust Soc Am       Date:  1994-01       Impact factor: 1.840

9.  Discrimination and identification of vowels by young, hearing-impaired adults.

Authors:  Carolyn Richie; Diane Kewley-Port; Maureen Coughlin
Journal:  J Acoust Soc Am       Date:  2003-11       Impact factor: 1.840

10.  The relation between identification and discrimination of vowels in young and elderly listeners.

Authors:  M Coughlin; D Kewley-Port; L E Humes
Journal:  J Acoust Soc Am       Date:  1998-12       Impact factor: 1.840

Cited by: 1 in total

1.  The neural encoding of formant frequencies contributing to vowel identification in normal-hearing listeners.

Authors:  Jong Ho Won; Kelly Tremblay; Christopher G Clinard; Richard A Wright; Elad Sagi; Mario Svirsky
Journal:  J Acoust Soc Am       Date:  2016-01       Impact factor: 1.840

