
Visual speech acts differently than lexical context in supporting speech perception.

Arthur G. Samuel, Jerrold Lieblich

Abstract

The speech signal is often badly articulated and heard under difficult listening conditions. To deal with these problems, listeners make use of various types of context. In the current study, we examine a type of context that in previous work has been shown to affect how listeners report what they hear: visual speech (i.e., the visible movements of the speaker's articulators). Despite the clear utility of this type of context under certain conditions, prior studies have shown that visually driven phonetic percepts (via the "McGurk" effect) are not "real" enough to affect perception of later-occurring speech; such percepts have not produced selective adaptation effects. This failure contrasts with successful adaptation by sounds that are generated by lexical context (the word that a sound occurs within). We demonstrate here that this dissociation is robust, leading to the conclusion that visual and lexical contexts operate differently. We suggest that the dissociation reflects the dual nature of speech as both a perceptual object and a linguistic object. Visual speech seems to contribute directly to the computations of the perceptual object but not the linguistic one, while lexical context is used in both types of computations.


Year:  2014        PMID: 24749935      PMCID: PMC4122614          DOI: 10.1037/a0036656

Source DB:  PubMed          Journal:  J Exp Psychol Hum Percept Perform        ISSN: 0096-1523            Impact factor:   3.332


References (31 in total)

1.  Perception of /r/ and /l/ in a stop cluster: evidence of cross-modal context effects.

Authors:  K P Green; L W Norrix
Journal:  J Exp Psychol Hum Percept Perform       Date:  2001-02       Impact factor: 3.332

2.  The time-limited influence of sentential context on function word identification.

Authors:  P van Alphen; J M McQueen
Journal:  J Exp Psychol Hum Percept Perform       Date:  2001-10       Impact factor: 3.332

3.  Visual recalibration of auditory speech identification: a McGurk aftereffect.

Authors:  Paul Bertelson; Jean Vroomen; Béatrice De Gelder
Journal:  Psychol Sci       Date:  2003-11

4.  Electrophysiological indicators of phonetic and non-phonetic multisensory interactions during audiovisual speech perception.

Authors:  Vasily Klucharev; Riikka Möttönen; Mikko Sams
Journal:  Brain Res Cogn Brain Res       Date:  2003-12

5.  Lexical influences in audiovisual speech perception.

Authors:  Lawrence Brancazio
Journal:  J Exp Psychol Hum Percept Perform       Date:  2004-06       Impact factor: 3.332

6.  Bimodal speech: early suppressive visual effects in human auditory cortex.

Authors:  Julien Besle; Alexandra Fort; Claude Delpuech; Marie-Hélène Giard
Journal:  Eur J Neurosci       Date:  2004-10       Impact factor: 3.386

7.  Knowing a word affects the fundamental perception of the sounds within it.

Authors:  A G Samuel
Journal:  Psychol Sci       Date:  2001-07

8.  [Review] Perception of the speech code.

Authors:  A M Liberman; F S Cooper; D P Shankweiler; M Studdert-Kennedy
Journal:  Psychol Rev       Date:  1967-11       Impact factor: 8.934

9.  Perceptual restoration of missing speech sounds.

Authors:  R M Warren
Journal:  Science       Date:  1970-01-23       Impact factor: 47.728

10.  A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion.

Authors:  Audrey R Nath; Michael S Beauchamp
Journal:  Neuroimage       Date:  2011-07-20       Impact factor: 6.556

Cited by (8 in total)

1.  What you see isn't always what you get: Auditory word signals trump consciously perceived words in lexical access.

Authors:  Rachel Ostrand; Sheila E Blumstein; Victor S Ferreira; James L Morgan
Journal:  Cognition       Date:  2016-03-21

2.  The Role of Auditory and Visual Speech in Word Learning at 18 Months and in Adulthood.

Authors:  Mélanie Havy; Afra Foroud; Laurel Fais; Janet F Werker
Journal:  Child Dev       Date:  2017-01-26

3.  Influences of selective adaptation on perception of audiovisual speech.

Authors:  James W Dias; Theresa C Cook; Lawrence D Rosenblum
Journal:  J Phon       Date:  2016-05

4.  Tolerance for audiovisual asynchrony is enhanced by the spectrotemporal fidelity of the speaker's mouth movements and speech.

Authors:  Antoine J Shahin; Stanley Shen; Jess R Kerlin
Journal:  Lang Cogn Neurosci       Date:  2017-02-06       Impact factor: 2.331

5.  Learning Spoken Words via the Ears and Eyes: Evidence from 30-Month-Old Children.

Authors:  Mélanie Havy; Pascal Zesiger
Journal:  Front Psychol       Date:  2017-12-08

6.  Rapid recalibration of speech perception after experiencing the McGurk illusion.

Authors:  Claudia S Lüttke; Alexis Pérez-Bellido; Floris P de Lange
Journal:  R Soc Open Sci       Date:  2018-03-28       Impact factor: 2.963

7.  Audiovisual spoken word training can promote or impede auditory-only perceptual learning: prelingually deafened adults with late-acquired cochlear implants versus normal hearing adults.

Authors:  Lynne E Bernstein; Silvio P Eberhardt; Edward T Auer
Journal:  Front Psychol       Date:  2014-08-26

8.  A Selective Deficit in Phonetic Recalibration by Text in Developmental Dyslexia.

Authors:  Mirjam Keetels; Milene Bonte; Jean Vroomen
Journal:  Front Psychol       Date:  2018-05-15

© 2022-2023 北京卡尤迪生物科技股份有限公司 (Beijing Kayoudi Biotechnology Co., Ltd.)