Comparing speech and nonspeech context effects across timescales in coarticulatory contexts.

Navin Viswanathan, Damian G Kelty-Stephen.

Abstract

Context effects are ubiquitous in speech perception and reflect the ability of human listeners to successfully perceive highly variable speech signals. In the study of how listeners compensate for coarticulatory variability, past studies have used similar effects of speech and tone analogues of speech as strong support for speech-neutral, general auditory mechanisms for compensation for coarticulation. In this manuscript, we revisit compensation for coarticulation by replacing standard button-press responses with mouse-tracking responses and examining both standard geometric measures of uncertainty and newer information-theoretic measures that separate fast from slow mouse movements. We found that when our analyses were restricted to end-state responses, tone and speech contexts appeared to produce similar effects. However, a more detailed time-course analysis revealed systematic differences between speech and tone contexts such that listeners' responses to speech contexts, but not to tone contexts, changed across the experimental session. Analyses of the time course of effects within trials using mouse tracking indicated that speech contexts elicited fewer x-position flips but more area under the curve (AUC) and maximum deviation (MD), and they did so in the slower portions of mouse-tracking movements. Our results reveal critical differences between the time courses of speech and nonspeech context effects and suggest that general auditory explanations, motivated by their apparent similarity, should be reexamined.
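The geometric mouse-tracking measures named in the abstract can be illustrated with a minimal sketch. This is not the authors' analysis code; it assumes a single trial's cursor trajectory is given as x/y coordinate arrays, and the function name `mouse_tracking_measures` is illustrative. Maximum deviation (MD) is conventionally the largest perpendicular deviation from the straight start-to-end line, AUC is the area between the trajectory and that line, and x-flips count reversals of horizontal movement direction:

```python
import numpy as np

def mouse_tracking_measures(x, y):
    """Geometric measures for one mouse-tracking trial trajectory.

    x, y : cursor coordinates sampled from trial start to response.
    Returns (max_deviation, auc, x_flips).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)

    # Signed perpendicular distance of each sample from the straight
    # start-to-end line (the idealized direct response path).
    start = np.array([x[0], y[0]])
    end = np.array([x[-1], y[-1]])
    d = end - start
    length = np.hypot(d[0], d[1])
    signed = (d[0] * (y - start[1]) - d[1] * (x - start[0])) / length

    # Maximum deviation (MD): signed value of the largest excursion.
    md = signed[np.argmax(np.abs(signed))]

    # AUC: trapezoidal integral of signed deviation along the direct path.
    t = np.linspace(0.0, 1.0, len(x))
    auc = np.sum((signed[1:] + signed[:-1]) / 2.0 * np.diff(t)) * length

    # x-flips: sign reversals in horizontal movement direction.
    dx = np.diff(x)
    dx = dx[dx != 0]
    x_flips = int(np.count_nonzero(np.diff(np.sign(dx))))

    return md, auc, x_flips
```

A perfectly straight trajectory yields MD = 0, AUC = 0, and zero x-flips; a trajectory that zigzags horizontally while moving toward the response accrues x-flips and nonzero deviation measures.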

Keywords:  Similarity; Speech perception; Temporal processing

Year:  2018        PMID: 29134576      PMCID: PMC5800974          DOI: 10.3758/s13414-017-1449-8

Source DB:  PubMed          Journal:  Atten Percept Psychophys        ISSN: 1943-3921            Impact factor:   2.199


  20 in total

1.  The role of feedback information for calibration and attunement in perceiving length by dynamic touch.

Authors:  Rob Withagen; Claire F Michaels
Journal:  J Exp Psychol Hum Percept Perform       Date:  2005-12       Impact factor: 3.332

2.  Putting phonetic context effects into context: a commentary on Fowler (2006).

Authors:  Andrew J Lotto; Lori L Holt
Journal:  Percept Psychophys       Date:  2006-02

3.  General contrast effects in speech perception: effect of preceding liquid on stop consonant identification.

Authors:  A J Lotto; K R Kluender
Journal:  Percept Psychophys       Date:  1998-05

4.  Analyzing spatial data from mouse tracker methodology: An entropic approach.

Authors:  Antonio Calcagnì; Luigi Lombardi; Simone Sulpizio
Journal:  Behav Res Methods       Date:  2017-12

5.  Influence of preceding liquid on stop-consonant perception.

Authors:  V A Mann
Journal:  Percept Psychophys       Date:  1980-11

6.  Similar response patterns do not imply identical origins: an energetic masking account of nonspeech effects in compensation for coarticulation.

Authors:  Navin Viswanathan; James S Magnuson; Carol A Fowler
Journal:  J Exp Psychol Hum Percept Perform       Date:  2012-11-12       Impact factor: 3.332

7.  Quantitative characterization of functional anatomical contributions to cognitive control under uncertainty.

Authors:  Jin Fan; Nicholas T Van Dam; Xiaosi Gu; Xun Liu; Hongbin Wang; Cheuk Y Tang; Patrick R Hof
Journal:  J Cogn Neurosci       Date:  2014-01-06       Impact factor: 3.225

8.  Information for coarticulation: Static signal properties or formant dynamics?

Authors:  Navin Viswanathan; James S Magnuson; Carol A Fowler
Journal:  J Exp Psychol Hum Percept Perform       Date:  2014-04-14       Impact factor: 3.332

9.  Compensation for visually specified coarticulation in liquid-stop contexts.

Authors:  Navin Viswanathan; Joseph D W Stephens
Journal:  Atten Percept Psychophys       Date:  2016-11       Impact factor: 2.199

10.  Constraints on the processes responsible for the extrinsic normalization of vowels.

Authors:  Matthias J Sjerps; Holger Mitterer; James M McQueen
Journal:  Atten Percept Psychophys       Date:  2011-05       Impact factor: 2.199

  2 in total

1.  Perceiving and remembering speech depend on multifractal nonlinearity in movements producing and exploring speech.

Authors:  Lauren Bloomfield; Elizabeth Lane; Madhur Mangalam; Damian G Kelty-Stephen
Journal:  J R Soc Interface       Date:  2021-08-04       Impact factor: 4.293

2.  Bringing the Nonlinearity of the Movement System to Gestural Theories of Language Use: Multifractal Structure of Spoken English Supports the Compensation for Coarticulation in Human Speech Perception.

Authors:  Rachel M Ward; Damian G Kelty-Stephen
Journal:  Front Physiol       Date:  2018-09-03       Impact factor: 4.566
