Tracking recognition of spoken words by tracking looks to printed words.

James M McQueen, Malte C Viebahn.

Abstract

Eye movements of Dutch participants were tracked as they looked at arrays of four words on a computer screen and followed spoken instructions (e.g., "Klik op het woord buffel": Click on the word buffalo). The arrays included the target (e.g., buffel), a phonological competitor (e.g., buffer), and two unrelated distractors. Targets were monosyllabic or bisyllabic, and competitors mismatched targets only on either their onset or offset phoneme and only by one distinctive feature. Participants looked at competitors more than at distractors, but this effect was much stronger for offset-mismatch than onset-mismatch competitors. Fixations to competitors started to decrease as soon as phonetic evidence disfavouring those competitors could influence behaviour. These results confirm that listeners continuously update their interpretation of words as the evidence in the speech signal unfolds, and hence establish the viability of using eye movements to arrays of printed words to track spoken-word recognition.


Year:  2007        PMID: 17455074     DOI: 10.1080/17470210601183890

Source DB:  PubMed          Journal:  Q J Exp Psychol (Hove)        ISSN: 1747-0218            Impact factor:   2.143


Related articles (27 in total)

1.  The divided visual world paradigm: eye tracking reveals hemispheric asymmetries in lexical ambiguity resolution.

Authors:  Aaron M Meyer; Kara D Federmeier
Journal:  Brain Res       Date:  2008-05-21       Impact factor: 3.252

2.  Abstract Conceptual Feature Ratings Predict Gaze Within Written Word Arrays: Evidence From a Visual Wor(l)d Paradigm.

Authors:  Silvia Primativo; Jamie Reilly; Sebastian J Crutch
Journal:  Cogn Sci       Date:  2016-02-22

3.  English Listeners Use Suprasegmental Cues to Lexical Stress Early During Spoken-Word Recognition.

Authors:  Alexandra Jesse; Katja Poellmann; Ying-Yee Kong
Journal:  J Speech Lang Hear Res       Date:  2017-01-01       Impact factor: 2.297

4.  Competing speech perception in older and younger adults: behavioral and eye-movement evidence.

Authors:  Karen S Helfer; Adrian Staub
Journal:  Ear Hear       Date:  2014 Mar-Apr       Impact factor: 3.570

5.  Speech-perception training for older adults with hearing loss impacts word recognition and effort.

Authors:  Stefanie E Kuchinsky; Jayne B Ahlstrom; Stephanie L Cute; Larry E Humes; Judy R Dubno; Mark A Eckert
Journal:  Psychophysiology       Date:  2014-06-09       Impact factor: 4.016

6.  The nature of the visual environment induces implicit biases during language-mediated visual search.

Authors:  Falk Huettig; James M McQueen
Journal:  Mem Cognit       Date:  2011-08

7.  Pupil size varies with word listening and response selection difficulty in older adults with hearing loss.

Authors:  Stefanie E Kuchinsky; Jayne B Ahlstrom; Kenneth I Vaden; Stephanie L Cute; Larry E Humes; Judy R Dubno; Mark A Eckert
Journal:  Psychophysiology       Date:  2012-11-15       Impact factor: 4.016

8.  Visual attention shift to printed words during spoken word recognition in Chinese: The role of phonological information.

Authors:  Wei Shen; Qingqing Qu; Xiuhong Tong
Journal:  Mem Cognit       Date:  2018-05

9.  Immediate effects of anticipatory coarticulation in spoken-word recognition.

Authors:  Anne Pier Salverda; Dave Kleinschmidt; Michael K Tanenhaus
Journal:  J Mem Lang       Date:  2014-02-01       Impact factor: 3.059

10.  Using Eye Movements Recorded in the Visual World Paradigm to Explore the Online Processing of Spoken Language.

Authors:  Likan Zhan
Journal:  J Vis Exp       Date:  2018-10-13       Impact factor: 1.355

