
Visual prosody and speech intelligibility: head movement improves auditory speech perception.

K G Munhall, Jeffery A Jones, Daniel E Callan, Takaaki Kuratate, Eric Vatikiotis-Bateson.

Abstract

People naturally move their heads when they speak, and our study shows that this rhythmic head motion conveys linguistic information. Three-dimensional head and face motion and the acoustics of a talker producing Japanese sentences were recorded and analyzed. The head movement correlated strongly with the pitch (fundamental frequency) and amplitude of the talker's voice. In a perception study, Japanese subjects viewed realistic talking-head animations based on these movement recordings in a speech-in-noise task. The animations allowed the head motion to be manipulated without changing other characteristics of the visual or acoustic speech. Subjects correctly identified more syllables when natural head motion was present in the animation than when it was eliminated or distorted. These results suggest that nonverbal gestures such as head movements play a more direct role in the perception of speech than previously known.


Year:  2004        PMID: 14738521     DOI: 10.1111/j.0963-7214.2004.01502010.x

Source DB:  PubMed          Journal:  Psychol Sci        ISSN: 0956-7976


Cited by: 80 in total

1.  Seeing a singer helps comprehension of the song's lyrics.

Authors:  Alexandra Jesse; Dominic W Massaro
Journal:  Psychon Bull Rev       Date:  2010-06

2.  [Review] Temporal context in speech processing and attentional stream selection: a behavioral and neural perspective.

Authors:  Elana M Zion Golumbic; David Poeppel; Charles E Schroeder
Journal:  Brain Lang       Date:  2012-01-29       Impact factor: 2.381

3.  [Review] The processing of audio-visual speech: empirical and neural bases.

Authors:  Ruth Campbell
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2008-03-12       Impact factor: 6.237

4.  [Review] Neuronal oscillations and visual amplification of speech.

Authors:  Charles E Schroeder; Peter Lakatos; Yoshinao Kajikawa; Sarah Partan; Aina Puce
Journal:  Trends Cogn Sci       Date:  2008-02-15       Impact factor: 20.229

5.  Seeing pitch: visual information for lexical tones of Mandarin-Chinese.

Authors:  Trevor H Chen; Dominic W Massaro
Journal:  J Acoust Soc Am       Date:  2008-04       Impact factor: 1.840

6.  Leveraging audiovisual speech perception to measure anticipatory coarticulation.

Authors:  Melissa A Redford; Jeffrey E Kallay; Sergei V Bogdanov; Eric Vatikiotis-Bateson
Journal:  J Acoust Soc Am       Date:  2018-10       Impact factor: 1.840

7.  Entrained neural oscillations in multiple frequency bands comodulate behavior.

Authors:  Molly J Henry; Björn Herrmann; Jonas Obleser
Journal:  Proc Natl Acad Sci U S A       Date:  2014-09-29       Impact factor: 11.205

8.  Free viewing of talking faces reveals mouth and eye preferring regions of the human superior temporal sulcus.

Authors:  Johannes Rennig; Michael S Beauchamp
Journal:  Neuroimage       Date:  2018-08-06       Impact factor: 6.556

9.  Eye Can Hear Clearly Now: Inverse Effectiveness in Natural Audiovisual Speech Processing Relies on Long-Term Crossmodal Temporal Integration.

Authors:  Michael J Crosse; Giovanni M Di Liberto; Edmund C Lalor
Journal:  J Neurosci       Date:  2016-09-21       Impact factor: 6.167

10.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

