Abstract
Linguistic manual gestures are the basis of sign languages used by deaf individuals. Working memory and language processing are intimately connected and thus when language is gesture-based, it is important to understand related working memory mechanisms. This article reviews work on working memory for linguistic and non-linguistic manual gestures and discusses theoretical and applied implications. Empirical evidence shows that there are effects of load and stimulus degradation on working memory for manual gestures. These effects are similar to those found for working memory for speech-based language. Further, there are effects of pre-existing linguistic representation that are partially similar across language modalities. But above all, deaf signers score higher than hearing non-signers on an n-back task with sign-based stimuli, irrespective of their semantic and phonological content, but not with non-linguistic manual actions. This pattern may be partially explained by recent findings relating to cross-modal plasticity in deaf individuals. It suggests that in linguistic gesture-based working memory, semantic aspects may outweigh phonological aspects when processing takes place under challenging conditions. The close association between working memory and language development should be taken into account in understanding and alleviating the challenges faced by deaf children growing up with cochlear implants as well as other clinical populations.
Keywords: cochlear implantation; deafness; manual gestures; phonology; semantics; sign language; working memory
Year: 2018 PMID: 29867655 PMCID: PMC5962724 DOI: 10.3389/fpsyg.2018.00679
Source DB: PubMed Journal: Front Psychol ISSN: 1664-1078
Overview of studies using the n-back paradigm to investigate working memory for linguistic and non-linguistic manual gestures.
| Reference | Method | Stimulus material | Sample | Working memory load (n) | Additional design factors | Results |
|---|---|---|---|---|---|---|
| | Behavioral | Video-recorded meaningless manual gestures | 20 hearing non-signers | 1, 2 | Location (proximal, distal) | At low load, gesture proximity reduced performance |
| | Behavioral | Video-recorded lexical signs | 20 hearing non-signers | 1, 2 | Resolution at 5 levels | Low resolution reduced performance more when load was high |
| | fMRI | Video-recorded lexical signs and words | 13 hearing native signers | 2 | Match (sign-sign, word-word, sign-word) | Cross-modal processing activated posterior regions, including the right middle temporal lobe, possibly reflecting binding of phonological-loop representations with semantic representations in long-term memory |
| | fMRI | Pictures | 11 deaf signers, 20 hearing non-signers | 2 | Semantic, phonological, orthographic | Poorer behavioral performance in the phonological and orthographic conditions than in the semantic condition; distinct neural networks at all three levels of linguistic processing, with modality-specific differences |
| | Behavioral | Video-recorded familiar and unfamiliar signs, non-signs, and non-linguistic manual actions | 24 deaf signers, 20 hearing signers, 24 hearing non-signers | 1, 2, 3 | Load and material manipulated orthogonally | Hearing signers performed better with familiar than unfamiliar signs. Deaf signers also performed better with familiar than unfamiliar signs, but only when load was high. Deaf signers outperformed hearing non-signers with all materials except non-linguistic manual actions |
| | fMRI | Dynamic point-light displays of lexical signs or nonsense objects | 12 deaf signers, 16 hearing native signers, 16 hearing non-signers | 2 | Attentional control task with both materials | Deaf signers showed more posterior temporal and less fronto-parietal activation, as well as increased resting-state connectivity between frontal and temporal regions; these effects were independent of the linguistic characteristics of the stimuli |