
A common neural system is activated in hearing non-signers to process French sign language and spoken French.

Cyril Courtin, Gaël Jobard, Mathieu Vigneau, Virginie Beaucousin, Annick Razafimandimby, Pierre-Yves Hervé, Emmanuel Mellet, Laure Zago, Laurent Petit, Bernard Mazoyer, Nathalie Tzourio-Mazoyer.

Abstract

We used functional magnetic resonance imaging to investigate the areas activated by signed narratives in non-signing subjects naïve to sign language (SL) and compared them to the activations obtained when hearing speech in their mother tongue. A subset of left hemisphere (LH) language areas activated when participants watched an audio-visual narrative in their mother tongue was also activated when they observed a signed narrative. The inferior frontal (IFG) and precentral (Prec) gyri, the posterior parts of the planum temporale (pPT) and of the superior temporal sulcus (pSTS), and the occipito-temporal junction (OTJ) were activated by both languages. The activity of these regions was not related to the presence of communicative intent, because no such changes were observed when the non-signers watched a muted video of a spoken narrative. Recruitment was also not triggered by the linguistic structure of SL, because these areas, except pPT, were not activated when subjects listened to an unknown spoken language. The comparison of brain reactivity for spoken and sign languages shows that SL has a special status in the brain compared to speech; in contrast to an unknown oral language, the neural correlates of SL overlap LH speech comprehension areas in non-signers. These results support the idea that strong relationships exist between areas involved in human action observation and language, suggesting that the observation of hand gestures has shaped the lexico-semantic language areas, as proposed by the motor theory of speech. As a whole, the present results support the theory of a gestural origin of language.
Copyright © 2010 Elsevier Inc. All rights reserved.


Year:  2010        PMID: 20933062     DOI: 10.1016/j.brainresbull.2010.09.013

Source DB:  PubMed          Journal:  Brain Res Bull        ISSN: 0361-9230            Impact factor:   4.077


  6 in total

1.  Simultaneous perception of a spoken and a signed language: The brain basis of ASL-English code-blends.

Authors:  Jill Weisberg; Stephen McCullough; Karen Emmorey
Journal:  Brain Lang       Date:  2015-07-10       Impact factor: 2.381

2.  Graph theoretical analysis of functional network for comprehension of sign language.

Authors:  Lanfang Liu; Xin Yan; Jin Liu; Mingrui Xia; Chunming Lu; Karen Emmorey; Mingyuan Chu; Guosheng Ding
Journal:  Brain Res       Date:  2017-07-06       Impact factor: 3.252

3.  How sensory-motor systems impact the neural organization for language: direct contrasts between spoken and signed language.

Authors:  Karen Emmorey; Stephen McCullough; Sonya Mehta; Thomas J Grabowski
Journal:  Front Psychol       Date:  2014-05-27

4.  Associations Between Sign Language Skills and Resting-State Functional Connectivity in Deaf Early Signers.

Authors:  Emil Holmer; Krister Schönström; Josefine Andin
Journal:  Front Psychol       Date:  2022-03-18

5.  Perceived Conventionality in Co-speech Gestures Involves the Fronto-Temporal Language Network.

Authors:  Dhana Wolf; Linn-Marlen Rekittke; Irene Mittelberg; Martin Klasen; Klaus Mathiak
Journal:  Front Hum Neurosci       Date:  2017-11-30       Impact factor: 3.169

6.  Multimodal imaging of brain reorganization in hearing late learners of sign language.

Authors:  Anna Banaszkiewicz; Jacek Matuszewski; Łukasz Bola; Michał Szczepanik; Bartosz Kossowski; Paweł Rutkowski; Marcin Szwed; Karen Emmorey; Katarzyna Jednoróg; Artur Marchewka
Journal:  Hum Brain Mapp       Date:  2020-10-24       Impact factor: 5.399
