Visual cortex entrains to sign language.

Geoffrey Brookshire, Jenny Lu, Howard C Nusbaum, Susan Goldin-Meadow, Daniel Casasanto.

Abstract

Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (∼8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language below 5 Hz, peaking at ∼1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.
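The two analysis steps the abstract describes, quantifying visual change over time and measuring coherence between that signal and EEG in the <5 Hz band, can be sketched roughly as follows. This is a minimal illustration, not the authors' published pipeline: the frame-difference metric, the synthetic data, the shared sampling rate, and the use of `scipy.signal.coherence` are all assumptions made here for demonstration.

```python
import numpy as np
from scipy.signal import coherence

def visual_change(frames):
    """Mean absolute pixel difference between consecutive frames.

    A simple stand-in for an instantaneous visual-change metric:
    one value per frame transition, forming a time series that can
    be compared against a neural signal.
    """
    frames = np.asarray(frames, dtype=float)
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

fs = 30.0  # assumed common sampling rate (video frame rate), in Hz
rng = np.random.default_rng(0)

# 10 s of synthetic 48x64 grayscale video (placeholder for an ASL clip)
frames = rng.random((300, 48, 64))
vc = visual_change(frames)  # visual-change time series, length 299

# Placeholder "EEG" already resampled to the video frame rate
eeg = rng.standard_normal(vc.size)

# Magnitude-squared coherence between visual change and EEG,
# then restricted to the <5 Hz band highlighted in the abstract
f, Cxy = coherence(vc, eeg, fs=fs, nperseg=128)
low_band = Cxy[f < 5.0]
```

In a real analysis the EEG would be recorded at a much higher rate and aligned to the video before any coherence computation; the key idea shown here is simply that both signals are reduced to a common timeline and compared in the frequency domain.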

Keywords:  EEG; cortical entrainment; oscillations; sign language

Year:  2017        PMID: 28559320      PMCID: PMC5474824          DOI: 10.1073/pnas.1620350114

Source DB:  PubMed          Journal:  Proc Natl Acad Sci U S A        ISSN: 0027-8424            Impact factor:   11.205


References:  52 in total

1.  Language lateralization in a bimanual language.

Authors:  David P Corina; Lucila San Jose-Robertson; Andre Guillemin; Julia High; Allen R Braun
Journal:  J Cogn Neurosci       Date:  2003-07-01       Impact factor: 3.225

2.  An oscillatory hierarchy controlling neuronal excitability and stimulus processing in the auditory cortex.

Authors:  Peter Lakatos; Ankoor S Shah; Kevin H Knuth; Istvan Ulbert; George Karmos; Charles E Schroeder
Journal:  J Neurophysiol       Date:  2005-05-18       Impact factor: 2.714

3.  Phase patterns of neuronal responses reliably discriminate speech in human auditory cortex.

Authors:  Huan Luo; David Poeppel
Journal:  Neuron       Date:  2007-06-21       Impact factor: 17.173

4.  Cineradiography of monkey lip-smacking reveals putative precursors of speech dynamics.

Authors:  Asif A Ghazanfar; Daniel Y Takahashi; Neil Mathur; W Tecumseh Fitch
Journal:  Curr Biol       Date:  2012-05-31       Impact factor: 10.834

5.  The natural statistics of audiovisual speech.

Authors:  Chandramouli Chandrasekaran; Andrea Trubanova; Sébastien Stillittano; Alice Caplier; Asif A Ghazanfar
Journal:  PLoS Comput Biol       Date:  2009-07-17       Impact factor: 4.475

6.  Congruent Visual Speech Enhances Cortical Entrainment to Continuous Auditory Speech in Noise-Free Conditions.

Authors:  Michael J Crosse; John S Butler; Edmund C Lalor
Journal:  J Neurosci       Date:  2015-10-21       Impact factor: 6.167

7.  FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data.

Authors:  Robert Oostenveld; Pascal Fries; Eric Maris; Jan-Mathijs Schoffelen
Journal:  Comput Intell Neurosci       Date:  2010-12-23

8.  Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners.

Authors:  Hyojin Park; Robin A A Ince; Philippe G Schyns; Gregor Thut; Joachim Gross
Journal:  Curr Biol       Date:  2015-05-28       Impact factor: 10.834

9.  Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility.

Authors:  Hyojin Park; Christoph Kayser; Gregor Thut; Joachim Gross
Journal:  Elife       Date:  2016-05-05       Impact factor: 8.140

10.  Irregular Speech Rate Dissociates Auditory Cortical Entrainment, Evoked Responses, and Frontal Alpha.

Authors:  Stephanie J Kayser; Robin A A Ince; Joachim Gross; Christoph Kayser
Journal:  J Neurosci       Date:  2015-11-04       Impact factor: 6.167

Cited by:  5 in total

1.  Does gesture strengthen sensorimotor knowledge of objects? The case of the size-weight illusion.

Authors:  Wim Pouw; Stephanie I Wassenburg; Autumn B Hostetter; Bjorn B de Koning; Fred Paas
Journal:  Psychol Res       Date:  2018-12-14

2.  Alteration of Cortical and Subcortical Structures in Children With Profound Sensorineural Hearing Loss.

Authors:  Hang Qu; Hui Tang; Jiahao Pan; Yi Zhao; Wei Wang
Journal:  Front Hum Neurosci       Date:  2020-12-09       Impact factor: 3.169

3.  Low-Frequency Entrainment to Visual Motion Underlies Sign Language Comprehension.

Authors:  E A Malaia; S C Borneman; J Krebs; R B Wilbur
Journal:  IEEE Trans Neural Syst Rehabil Eng       Date:  2021-12-03       Impact factor: 3.802

4.  Predictive Processing in Sign Languages: A Systematic Review.

Authors:  Tomislav Radošević; Evie A Malaia; Marina Milković
Journal:  Front Psychol       Date:  2022-04-14

5.  The quantification of gesture-speech synchrony: A tutorial and validation of multimodal data acquisition using device-based and video-based motion tracking.

Authors:  Wim Pouw; James P Trujillo; James A Dixon
Journal:  Behav Res Methods       Date:  2020-04
