Aaron C. Moberly¹, Joanna H. Lowenstein², Susan Nittrouer². ¹Department of Otolaryngology–Head and Neck Surgery, The Ohio State University Wexner Medical Center, Columbus, Ohio, USA; ²Department of Speech, Language, and Hearing Sciences, University of Florida, Gainesville, Florida, USA.
Abstract
OBJECTIVES: Cochlear implantation does not automatically result in robust spoken language understanding for postlingually deafened adults. Enormous outcome variability exists, related to the complexity of understanding spoken language through cochlear implants (CIs), which deliver degraded speech representations. This investigation examined variability in word recognition as explained by "perceptual attention" and "auditory sensitivity" to acoustic cues underlying speech perception.
DESIGN: Thirty postlingually deafened adults with CIs and 20 age-matched controls with normal hearing (NH) were tested. Participants underwent assessment of word recognition in quiet and perceptual attention (cue-weighting strategies) based on labeling tasks for two phonemic contrasts: (1) "cop"–"cob," based on a duration cue (easily accessible through CIs) or a dynamic spectral cue (less accessible through CIs), and (2) "sa"–"sha," based on static or dynamic spectral cues (both potentially poorly accessible through CIs). Participants were also assessed for auditory sensitivity to the speech cues underlying those labeling decisions.
RESULTS: Word recognition varied widely among CI users (20 to 96%), but it was generally poorer than for NH participants. Implant users and NH controls showed similar perceptual attention and auditory sensitivity to the duration cue, while CI users showed poorer attention and sensitivity to all spectral cues. Both attention and sensitivity to spectral cues predicted variability in word recognition.
CONCLUSIONS: For CI users, both perceptual attention and auditory sensitivity are important in word recognition. Efforts should be made to better represent spectral cues through implants, while also facilitating attention to these cues through auditory training.