
Detection and Attention for Auditory, Visual, and Audiovisual Speech in Children with Hearing Loss.

Susan Jerger1,2, Markus F Damian3, Cassandra Karl1,2, Hervé Abdi1.   

Abstract

OBJECTIVES: Efficient multisensory speech detection is critical for children who must quickly detect/encode a rapid stream of speech to participate in conversations and have access to the audiovisual cues that underpin speech and language development, yet multisensory speech detection remains understudied in children with hearing loss (CHL). This research assessed detection, along with vigilant/goal-directed attention, for multisensory versus unisensory speech in CHL versus children with normal hearing (CNH).
DESIGN: Participants were 60 CHL who used hearing aids and communicated successfully aurally/orally, and 60 age-matched CNH. Simple response times measured how quickly children detected a preidentified, easy-to-hear stimulus (the utterance "buh" at 70 dB SPL, presented in auditory-only [A], visual-only [V], or audiovisual [AV] mode). The V mode comprised two facial conditions: a static versus a dynamic face. Faster detection for multisensory (AV) than unisensory (A or V) input indicates multisensory facilitation. We assessed mean response times as well as faster versus slower responses (defined by the first versus third quartiles of the response-time distributions): faster responses (first quartile) were conceptualized as reflecting efficient detection with efficient vigilant/goal-directed attention, whereas slower responses (third quartile) were conceptualized as reflecting less efficient detection associated with attentional lapses. Finally, we examined associations between these results and the personal characteristics of the CHL.
RESULTS: Unisensory A versus V modes: Both groups showed better detection and attention for A than V input. The A input more readily captured children's attention and minimized attentional lapses, which supports A-bound processing even by CHL who were processing low-fidelity A input. CNH and CHL did not differ in their ability to detect A input at conversational speech level. Multisensory AV versus A modes: Both groups showed better detection and attention for AV than A input. The advantage for AV input reflected a facial effect (it occurred with both static and dynamic faces), a pattern suggesting that communication is a social interaction that is more than just words. Attention did not differ between groups; detection was faster in CHL than CNH for AV input, but not for A input. Associations between personal characteristics/degree of hearing loss of CHL and results: CHL with the greatest deficits in detection of V input had the poorest word-recognition skills, and CHL with the greatest reduction of attentional lapses from AV input had the poorest vocabulary skills. Both outcomes are consistent with the idea that CHL who are processing low-fidelity A input depend disproportionately on V and AV input to learn to identify words and associate them with concepts. As CHL aged, attention to V input improved. Degree of hearing loss did not influence results.
CONCLUSIONS: Understanding speech, a daily challenge for CHL, is a complex task that demands efficient detection of and attention to AV speech cues. Our results support the clinical importance of multisensory approaches to understanding and advancing spoken communication by CHL.


Year: 2020        PMID: 31592903        PMCID: PMC7136139        DOI: 10.1097/AUD.0000000000000798

Source DB: PubMed        Journal: Ear Hear        ISSN: 0196-0202        Impact factor: 3.570


References (52 in total)

1.  Phonological processing, language, and literacy: a comparison of children with mild-to-moderate sensorineural hearing loss and those with specific language impairment.

Authors:  J Briscoe; D V Bishop; C F Norbury
Journal:  J Child Psychol Psychiatry       Date:  2001-03       Impact factor: 8.982

2.  Validation of reaction time in continuous performance tasks as an index of attention by electrophysiological measures.

Authors:  I Reinvang
Journal:  J Clin Exp Neuropsychol       Date:  1998-12       Impact factor: 2.475

3.  Specific auditory perceptual dysfunction in a learning disabled child.

Authors:  S Jerger; R C Martin; J Jerger
Journal:  Ear Hear       Date:  1987-04       Impact factor: 3.570

4.  Dividing attention between color and shape: evidence of coactivation.

Authors:  J T Mordkoff; S Yantis
Journal:  Percept Psychophys       Date:  1993-04

5.  Visual speech alters the discrimination and identification of non-intact auditory speech in children with hearing loss.

Authors:  Susan Jerger; Markus F Damian; Rachel P McAlpine; Hervé Abdi
Journal:  Int J Pediatr Otorhinolaryngol       Date:  2017-01-09       Impact factor: 1.675

6. [Review] Multisensory Integration in Cochlear Implant Recipients.

Authors:  Ryan A Stevenson; Sterling W Sheffield; Iliza M Butera; René H Gifford; Mark T Wallace
Journal:  Ear Hear       Date:  2017 Sep/Oct       Impact factor: 3.570

7.  Sustained attention and prediction: distinct brain maturation trajectories during adolescence.

Authors:  Alix Thillay; Sylvie Roux; Valérie Gissot; Isabelle Carteau-Martin; Robert T Knight; Frédérique Bonnet-Brilhault; Aurélie Bidet-Caulet
Journal:  Front Hum Neurosci       Date:  2015-09-24       Impact factor: 3.169

8. [Review] Assessing the Role of the 'Unity Assumption' on Multisensory Integration: A Review.

Authors:  Yi-Chuan Chen; Charles Spence
Journal:  Front Psychol       Date:  2017-03-31

9.  Application of the ex-Gaussian function to the effect of the word blindness suggestion on Stroop task performance suggests no word blindness.

Authors:  Benjamin A Parris; Zoltan Dienes; Timothy L Hodgson
Journal:  Front Psychol       Date:  2013-09-20

10.  Eye Movements During Visual Speech Perception in Deaf and Hearing Children.

Authors:  Elizabeth Worster; Hannah Pimperton; Amelia Ralph-Lewis; Laura Monroy; Charles Hulme; Mairéad MacSweeney
Journal:  Lang Learn       Date:  2017-09-26
Cited by (2 in total)

1.  Auditory experience modulates fronto-parietal theta activity serving fluid intelligence.

Authors:  Elizabeth Heinrichs-Graham; Elizabeth A Walker; Brittany K Taylor; Sophia C Menting; Jacob A Eastman; Michaela R Frenzel; Ryan W McCreery
Journal:  Brain Commun       Date:  2022-04-05

2.  Amount of Hearing Aid Use Impacts Neural Oscillatory Dynamics Underlying Verbal Working Memory Processing for Children With Hearing Loss.

Authors:  Elizabeth Heinrichs-Graham; Elizabeth A Walker; Jacob A Eastman; Michaela R Frenzel; Ryan W McCreery
Journal:  Ear Hear       Date:  2022 Mar/Apr       Impact factor: 3.562

