
Optical phonetics and visual perception of lexical and phrasal stress in English.

Rebecca Scarborough, Patricia Keating, Sven L Mattys, Taehong Cho, Abeer Alwan.

Abstract

In a study of optical cues to the visual perception of stress, three American English talkers spoke words that differed in lexical stress and sentences that differed in phrasal stress, while video and movements of the face were recorded. The production of stressed and unstressed syllables from these utterances was analyzed along many measures of facial movement, which were generally larger and faster in the stressed condition. In a visual perception experiment, 16 perceivers identified the location of stress in forced-choice judgments of video clips of these utterances (without audio). Phrasal stress was better perceived than lexical stress. The relation of the visual intelligibility of the prosody of these utterances to the optical characteristics of their production was analyzed to determine which cues are associated with successful visual perception. While most optical measures were correlated with perception performance, chin measures, especially Chin Opening Displacement, contributed the most to correct perception independently of the other measures. Thus, our results indicate that the information for visual stress perception is mainly associated with mouth opening movements.


Year:  2009        PMID: 19624028     DOI: 10.1177/0023830909103165

Source DB:  PubMed          Journal:  Lang Speech        ISSN: 0023-8309            Impact factor:   1.500


Related articles: 11 in total

Review 1.  Temporal context in speech processing and attentional stream selection: a behavioral and neural perspective.

Authors:  Elana M Zion Golumbic; David Poeppel; Charles E Schroeder
Journal:  Brain Lang       Date:  2012-01-29       Impact factor: 2.381

2.  English Listeners Use Suprasegmental Cues to Lexical Stress Early During Spoken-Word Recognition.

Authors:  Alexandra Jesse; Katja Poellmann; Ying-Yee Kong
Journal:  J Speech Lang Hear Res       Date:  2017-01-01       Impact factor: 2.297

3.  Indexing head movement during speech production using optical markers.

Authors:  Kevin D Roon; Katherine M Dawson; Mark K Tiede; D H Whalen
Journal:  J Acoust Soc Am       Date:  2016-05       Impact factor: 1.840

4.  Experiments on Auditory-Visual Perception of Sentences by Users of Unilateral, Bimodal, and Bilateral Cochlear Implants.

Authors:  Michael F Dorman; Julie Liss; Shuai Wang; Visar Berisha; Cimarron Ludwig; Sarah Cook Natale
Journal:  J Speech Lang Hear Res       Date:  2016-12-01       Impact factor: 2.297

5.  Looking Behavior and Audiovisual Speech Understanding in Children With Normal Hearing and Children With Mild Bilateral or Unilateral Hearing Loss.

Authors:  Dawna E Lewis; Nicholas A Smith; Jody L Spalding; Daniel L Valente
Journal:  Ear Hear       Date:  2018 Jul/Aug       Impact factor: 3.570

Review 6.  Multisensory Integration in Cochlear Implant Recipients.

Authors:  Ryan A Stevenson; Sterling W Sheffield; Iliza M Butera; René H Gifford; Mark T Wallace
Journal:  Ear Hear       Date:  2017 Sep/Oct       Impact factor: 3.570

7.  Visual input enhances selective speech envelope tracking in auditory cortex at a "cocktail party".

Authors:  Elana Zion Golumbic; Gregory B Cogan; Charles E Schroeder; David Poeppel
Journal:  J Neurosci       Date:  2013-01-23       Impact factor: 6.167

8.  Perceptual assimilation of lexical tone: the roles of language experience and visual information.

Authors:  Amanda Reid; Denis Burnham; Benjawan Kasisopa; Ronan Reilly; Virginie Attina; Nan Xu Rattanasone; Catherine T Best
Journal:  Atten Percept Psychophys       Date:  2015-02       Impact factor: 2.199

9.  Finding Phrases: The Interplay of Word Frequency, Phrasal Prosody and Co-speech Visual Information in Chunking Speech by Monolingual and Bilingual Adults.

Authors:  Irene de la Cruz-Pavía; Janet F Werker; Eric Vatikiotis-Bateson; Judit Gervain
Journal:  Lang Speech       Date:  2019-04-19       Impact factor: 1.500

10.  Finding phrases: On the role of co-verbal facial information in learning word order in infancy.

Authors:  Irene de la Cruz-Pavía; Judit Gervain; Eric Vatikiotis-Bateson; Janet F Werker
Journal:  PLoS One       Date:  2019-11-11       Impact factor: 3.240


Beijing Coyote Bioscience Co., Ltd. © 2022-2023.