Michael Grosvald, Eva Gutierrez, Sarah Hafer, David Corina.
Abstract
A fundamental advance in our understanding of human language would come from a detailed account of how non-linguistic and linguistic manual actions are differentiated in real time by language users. To explore this issue, we targeted the N400, an ERP component known to be sensitive to semantic context. Deaf signers saw 120 American Sign Language sentences, each consisting of a "frame" (a sentence without the last word; e.g. BOY SLEEP IN HIS) followed by a "last item" belonging to one of four categories: a high-close-probability sign (a "semantically reasonable" completion to the sentence; e.g. BED), a low-close-probability sign (a real sign that is nonetheless a "semantically odd" completion to the sentence; e.g. LEMON), a pseudo-sign (phonologically legal but non-lexical form), or a non-linguistic grooming gesture (e.g. the performer scratching her face). We found significant N400-like responses in the incongruent and pseudo-sign contexts, while the gestures elicited a large positivity.
Year: 2012 PMID: 22341555 PMCID: PMC3337787 DOI: 10.1016/j.bandl.2012.01.005
Source DB: PubMed Journal: Brain Lang ISSN: 0093-934X Impact factor: 2.381