
Cultural and linguistic factors in audiovisual speech processing: the McGurk effect in Chinese subjects.

K Sekiyama

Abstract

The "McGurk effect" demonstrates that visual (lip-read) information is used during speech perception even when it is discrepant with auditory information. While this has been established as a robust effect in subjects from Western cultures, our own earlier results had suggested that Japanese subjects use visual information much less than American subjects do (Sekiyama & Tohkura, 1993). The present study examined whether Chinese subjects would also show a reduced McGurk effect due to their cultural similarities with the Japanese. The subjects were 14 native speakers of Chinese living in Japan. Stimuli consisted of 10 syllables (/ba/, /pa/, /ma/, /wa/, /da/, /ta/, /na/, /ga/, /ka/, /ra/) pronounced by two speakers, one Japanese and one American. Each auditory syllable was dubbed onto every visual syllable within one speaker, resulting in 100 audiovisual stimuli in each language. The subjects' main task was to report what they thought they had heard while watching and listening to the speaker as the stimuli were uttered. Compared with previous results obtained with American subjects, the Chinese subjects showed a weaker McGurk effect. The results also showed that the magnitude of the McGurk effect depends on the length of time the Chinese subjects had lived in Japan. Factors that foster and alter the Chinese subjects' reliance on auditory information are discussed.


Year:  1997        PMID: 9038409     DOI: 10.3758/bf03206849

Source DB:  PubMed          Journal:  Percept Psychophys        ISSN: 0031-5117


Related articles (21 in total)

1.  Seeing pitch: visual information for lexical tones of Mandarin-Chinese.

Authors:  Trevor H Chen; Dominic W Massaro
Journal:  J Acoust Soc Am       Date:  2008-04       Impact factor: 1.840

2.  Perceptual congruency of audio-visual speech affects ventriloquism with bilateral visual stimuli.

Authors:  Shoko Kanaya; Kazuhiko Yokosawa
Journal:  Psychon Bull Rev       Date:  2011-02

3.  Similar frequency of the McGurk effect in large samples of native Mandarin Chinese and American English speakers.

Authors:  John F Magnotti; Debshila Basu Mallick; Guo Feng; Bin Zhou; Wen Zhou; Michael S Beauchamp
Journal:  Exp Brain Res       Date:  2015-06-04       Impact factor: 1.972

4.  Audiovisual sentence recognition not predicted by susceptibility to the McGurk effect.

Authors:  Kristin J Van Engen; Zilong Xie; Bharath Chandrasekaran
Journal:  Atten Percept Psychophys       Date:  2017-02       Impact factor: 2.199

5.  Psychophysics of the McGurk and other audiovisual speech integration effects.

Authors:  Jintao Jiang; Lynne E Bernstein
Journal:  J Exp Psychol Hum Percept Perform       Date:  2011-08       Impact factor: 3.332

6.  Audiovisual speech perception: A new approach and implications for clinical populations.

Authors:  Julia Irwin; Lori DiBlasi
Journal:  Lang Linguist Compass       Date:  2017-03-26

7.  Multisensory integration of speech signals: the relationship between space and time.

Authors:  Jeffery A Jones; Michelle Jarick
Journal:  Exp Brain Res       Date:  2006-08-10       Impact factor: 1.972

8.  Individual differences and the effect of face configuration information in the McGurk effect.

Authors:  Yuta Ujiie; Tomohisa Asai; Akio Wakabayashi
Journal:  Exp Brain Res       Date:  2018-01-30       Impact factor: 1.972

9.  The noisy encoding of disparity model of the McGurk effect.

Authors:  John F Magnotti; Michael S Beauchamp
Journal:  Psychon Bull Rev       Date:  2015-06

10.  Lip-reading aids word recognition most in moderate noise: a Bayesian explanation using high-dimensional feature space.

Authors:  Wei Ji Ma; Xiang Zhou; Lars A Ross; John J Foxe; Lucas C Parra
Journal:  PLoS One       Date:  2009-03-04       Impact factor: 3.240


Beijing Coyote Bioscience Co., Ltd. (北京卡尤迪生物科技股份有限公司) © 2022-2023.