
Looking Behavior and Audiovisual Speech Understanding in Children With Normal Hearing and Children With Mild Bilateral or Unilateral Hearing Loss.

Dawna E Lewis, Nicholas A Smith, Jody L Spalding, Daniel L Valente.

Abstract

OBJECTIVES: Visual information from talkers facilitates speech intelligibility for listeners when audibility is challenged by environmental noise and hearing loss. Less is known about how listeners actively process and attend to visual information from different talkers in complex multi-talker environments. This study tracked looking behavior in children with normal hearing (NH), mild bilateral hearing loss (MBHL), and unilateral hearing loss (UHL) in a complex multi-talker environment to examine the extent to which children look at talkers and whether looking patterns relate to performance on a speech-understanding task. It was hypothesized that performance would decrease as perceptual complexity increased and that children with hearing loss would perform more poorly than their peers with NH. Children with MBHL or UHL were expected to demonstrate greater attention to individual talkers during multi-talker exchanges, indicating that they were more likely to attempt to use visual information from talkers to assist in speech understanding in adverse acoustics. It was also of interest to examine whether MBHL and UHL would differentially affect performance and looking behavior.
DESIGN: Eighteen children with NH, eight children with MBHL, and 10 children with UHL participated (8-12 years). They followed audiovisual instructions for placing objects on a mat under three conditions: a single talker providing instructions via a video monitor, four possible talkers alternately providing instructions on separate monitors in front of the listener, and the same four talkers providing both target and nontarget information. Multi-talker background noise was presented at a 5 dB signal-to-noise ratio during testing. An eye tracker monitored looking behavior while children performed the experimental task.
RESULTS: Behavioral task performance was higher for children with NH than for either group of children with hearing loss. There were no differences in performance between children with UHL and children with MBHL. Eye-tracker analysis revealed that children with NH looked more at the screens overall than did children with MBHL or UHL, though individual differences were greater in the groups with hearing loss. Listeners in all groups spent a small proportion of time looking at relevant screens as talkers spoke. Although looking was distributed across all screens, there was a bias toward the right side of the display. There was no relationship between overall looking behavior and performance on the task.
CONCLUSIONS: The present study examined the processing of audiovisual speech in the context of a naturalistic task. Results demonstrated that children distributed their looking to a variety of sources during the task, but that children with NH were more likely to look at screens than were those with MBHL/UHL. However, all groups looked at the relevant talkers as they were speaking only a small proportion of the time. Despite variability in looking behavior, listeners were able to follow the audiovisual instructions and children with NH demonstrated better performance than children with MBHL/UHL. These results suggest that performance on some challenging multi-talker audiovisual tasks is not dependent on visual fixation to relevant talkers for children with NH or with MBHL/UHL.

Year: 2018    PMID: 29252979    PMCID: PMC6003828    DOI: 10.1097/AUD.0000000000000534

Source DB: PubMed    Journal: Ear Hear    ISSN: 0196-0202    Impact factor: 3.570


References: 70 in total

1.  Speech intelligibility of young school-aged children in the presence of real-life classroom noise.

Authors:  Donald G Jamieson; Garry Kranjc; Karen Yu; William E Hodgetts
Journal:  J Am Acad Audiol       Date:  2004 Jul-Aug       Impact factor: 1.664

2.  The effect of varying talker identity and listening conditions on gaze behavior during audiovisual speech perception.

Authors:  Julie N Buchan; Martin Paré; Kevin G Munhall
Journal:  Brain Res       Date:  2008-06-28       Impact factor: 3.252

3.  Visual information can hinder working memory processing of speech.

Authors:  Sushmit Mishra; Thomas Lunner; Stefan Stenfelt; Jerker Rönnberg; Mary Rudner
Journal:  J Speech Lang Hear Res       Date:  2013-06-19       Impact factor: 2.297

4.  The development of multisensory speech perception continues into the late childhood years.

Authors:  Lars A Ross; Sophie Molholm; Daniella Blanco; Manuel Gomez-Ramirez; Dave Saint-Amour; John J Foxe
Journal:  Eur J Neurosci       Date:  2011-05-25       Impact factor: 3.386

5.  Unilateral sensorineural hearing loss in children and auditory performance with respect to right/left ear differences.

Authors:  J Hartvig Jensen; P A Johansen; S Børre
Journal:  Br J Audiol       Date:  1989-08

6.  Audiovisual integration of speech falters under high attention demands.

Authors:  Agnès Alsius; Jordi Navarra; Ruth Campbell; Salvador Soto-Faraco
Journal:  Curr Biol       Date:  2005-05-10       Impact factor: 10.834

7.  How children and adults produce and perceive uncertainty in audiovisual speech.

Authors:  Emiel Krahmer; Marc Swerts
Journal:  Lang Speech       Date:  2005       Impact factor: 1.500

8.  When half a face is as good as a whole: effects of simple substantial occlusion on visual and audiovisual speech perception.

Authors:  Timothy R Jordan; Sharon M Thomas
Journal:  Atten Percept Psychophys       Date:  2011-10       Impact factor: 2.199

9.  Audiovisual integration in children listening to spectrally degraded speech.

Authors:  David W Maidment; Hi Jee Kang; Hannah J Stewart; Sygal Amitay
Journal:  J Speech Lang Hear Res       Date:  2015-02       Impact factor: 2.297

10.  Parsing eye-tracking data of variable quality to provide accurate fixation duration estimates in infants and adults.

Authors:  S V Wass; T J Smith; M H Johnson
Journal:  Behav Res Methods       Date:  2013-03
Cited by: 1 in total

1.  Face Masks Impact Auditory and Audiovisual Consonant Recognition in Children With and Without Hearing Loss.

Authors:  Kaylah Lalonde; Emily Buss; Margaret K Miller; Lori J Leibold
Journal:  Front Psychol       Date:  2022-05-13
