
Language and iconic gesture use in procedural discourse by speakers with aphasia.

Madeleine Pritchard1, Lucy Dipper1, Gary Morgan1, Naomi Cocks2.   

Abstract

Background: Conveying instructions is an everyday use of language, and gestures are likely to be a key feature of this. Although co-speech iconic gestures are tightly integrated with language, and people with aphasia (PWA) produce procedural discourses impaired at a linguistic level, no previous studies have investigated how PWA use co-speech iconic gestures in these contexts.

Aims: This study investigated how PWA communicated meaning using gesture and language in procedural discourses, compared with neurologically healthy people (NHP). We aimed to identify the relative relationship of gesture and speech, in the context of impaired language, both overall and in individual events.

Methods & Procedures: Twenty-nine PWA and 29 NHP produced two procedural discourses. The structure and semantic content of language of the whole discourses were analysed through predicate argument structure and spatial motor terms, and gestures were analysed for frequency and semantic form. Gesture and language were analysed in two key events, to determine the relative information presented in each modality.

Outcomes & Results: PWA and NHP used similar frequencies and forms of gestures, although PWA used syntactically simpler language and fewer spatial words. This meant, overall, relatively more information was present in PWA gesture. This finding was also reflected in the key events, where PWA used gestures conveying rich semantic information alongside semantically impoverished language more often than NHP.

Conclusions: PWA gestures, containing semantic information omitted from the concurrent speech, may help listeners with meaning when language is impaired. This finding indicates gesture should be included in clinical assessments of meaning-making.

Keywords:  aphasia; discourse; iconic gesture; procedural discourse; semantic weight

Year:  2015        PMID: 25999636      PMCID: PMC4409036          DOI: 10.1080/02687038.2014.993912

Source DB:  PubMed          Journal:  Aphasiology        ISSN: 0268-7038            Impact factor:   2.773


When communicating, speakers present information to construct meaning (Halliday, 2004), which involves spoken language but which is also likely to incorporate additional modalities, including co-speech gesture. Moreover, everyday communication includes a range of different discourses (Davidson, Worrall, & Hickson, 2003), and the relative informational load carried by spoken language and gesture will vary according to the genre of discourse being produced (Cocks, Hird, & Kirsner, 2007; Ulatowska, North, & Macaluso-Haynes, 1981). In evaluating the communication abilities of people with aphasia (PWA), it is therefore essential to understand how gesture and spoken language share the communicative burden in different discourse types. In this study, the focus is on procedural discourse. Co-speech gesture is shaped by the language it accompanies, and the relevant variation in language form and content can be grammatical, lexical, and/or semantic (Cocks, Dipper, Middleton, & Morgan, 2011; Dipper, Cocks, Rowe, & Morgan, 2011; Kendon, 2004; Kita, 2000; Kita & Özyürek, 2003; Pritchard, Cocks, & Dipper, 2013). For example, in a study completed by Kita and Özyürek (2003), the shape of speakers’ gestures was affected by their lexical choice among the verbs go, fly, and swing. Semantic differences between topics are also likely to affect gesture: in a discourse involving a large amount of spatial information, a large amount of spatial and/or motor meaning can be expected in gesture (Hostetter, Alibali, & Kita, 2007).

Aphasia and discourse

Evidence from naturalistic observation of older adults indicates that language to “inform and explain” accounts for 12.3% of total communicative activity, making it one of the most frequent everyday uses of language (Davidson et al., 2003). Giving instructions is likely to be particularly important in contexts where an individual is unable to complete a task themselves, such as when adults have difficulties completing activities of daily living independently. For example, a speaker with mobility difficulties may instruct a support worker in how they like their tea made, or give idiosyncratic instructions for how to operate their washing machine. Despite this, very limited research has explored how speakers communicate when informing and explaining. Procedural discourse is one example of language to inform and explain and was the focus of the current study. These discourses are goal-orientated monologues, in which a speaker gives instructions on completing a procedure. Previous research has asked speakers to describe how to make scrambled eggs, shop in a supermarket, change the wheel of a car, and wrap a box in paper for a present (Brady, Armstrong, & Mackenzie, 2006; Cocks et al., 2007; Nicholas & Brookshire, 1995; Shadden, Burnette, Eikenberry, & Dibrezzo, 1991; Ulatowska, Weiss Doyel, Freedman Stern, Macaluso-Haynes, & North, 1983; Ulatowska et al., 1981). Procedural tasks require a speaker to identify the main steps in a process and communicate them clearly, which will involve spoken language as well as additional modalities, especially gesture. Gesture may play a larger role in conveying meaning when language is impaired, such as in adults with aphasia following stroke. The research completed to date on procedural discourses indicates that they have specific attributes, including structurally simpler language and particular grammatical constructions, and that they contain more co-speech iconic gestures than other genres.
In comparison with fictional narratives, the procedural discourses produced by neurologically healthy people (NHP) tend to contain fewer t-units (defined as a clause plus any dependent or relative clauses); contain less complex language overall; and contain fewer subordinate clauses, which indicates an overall reduction in linguistic complexity (Shadden et al., 1991; Ulatowska et al., 1981). PWA use structurally less complex language than NHP consistently across a range of genres, as measured through sentence structures, predicate argument structures (PASs), t-units, and clauses per t-unit (Berndt, Haendiges, Mitchum, & Sandson, 1997; Cruice, Pritchard, & Dipper, 2014; Ulatowska et al., 1981, 1983; Webster, Franklin, & Howard, 2007). Differences between PWA and NHP have also been shown at the single word and semantic level, and these are likely to affect procedural discourse. First, PWA use fewer Correct Information Units (defined as any single word that is intelligible, informative, and relevant in context) in discourse than NHP (Nicholas & Brookshire, 1993). Second, PWA also use fewer types and tokens of spatial language in spatial tasks than NHP (Johnson, Cocks, & Dipper, 2013). Finally, speakers with aphasia and verb production deficits use verbs differently to speakers without verb production deficits, producing a high proportion of semantically “light” verbs containing little semantic information, such as come, go, make, take, get, give, do, have, be, and put (Berndt et al., 1997). Overall, these findings create a picture of PWA conveying less information in language, in a context where spoken language may already be structurally less complex. This means that additional communicative modalities are likely to become more important, raising questions about the role of gesture in such discourses, and about how aphasia affects it.

Gesture production

What do we mean when we talk about gesture, and why might it be affected by language impairment? Iconic gestures are not formalised, occur alongside spoken language, and reflect the semantic content of that language (Kendon, 2004; McNeill, 2000). The exact nature of the relationship between speech and gesture is the subject of theoretical debate, both in terms of where gesture and language interact and in terms of the function gesture serves. The Sketch Model (de Ruiter, 2000) asserts that gesture stems from conceptual imagery in working memory. The Growth Point Hypothesis (McNeill & Duncan, 2000) describes gesture and language arising at a common semantic level, and the Interface Hypothesis (Kita & Özyürek, 2003) describes gesture interacting with spoken language during lexical and clausal packaging. Finally, the Lexical Retrieval Hypothesis describes gesture in fluent speech arising at a conceptual level, but also being prompted by lexical retrieval failure, which then results in visualisation in working memory that supports lexical access (Krauss, Chen, & Gottesman, 2000). Consequently, each of these models can be used to explain why speakers with aphasia might gesture more: for communicative reasons (de Ruiter, 2000; Kita & Özyürek, 2003; McNeill & Duncan, 2000), or to facilitate lexical access (Krauss et al., 2000). Moreover, those models in which gesture serves a communicative role (de Ruiter, 2000; Kita & Özyürek, 2003; McNeill & Duncan, 2000) can also explain how gesture and spoken language together share the communicative burden in aphasia where, as well as the prelinguistic plan and constraints on lexical and clausal packaging, the two modalities must also negotiate spoken language production difficulties.

Gesture and spatial discourse

Speakers with no neurological impairments produce more gesture in procedural discourse than in other genres (Cocks et al., 2007; Feyereisen & Havard, 1999), a finding that is likely to be due to the spatial motor properties of the discourse. This gesture is likely to have a communicative role, whether or not this is the speaker’s intention, because listeners comprehend spatial relationships more accurately when speakers gesture. Listeners’ comprehension is particularly aided when the information is difficult to encode in language or relates to the relative position and size of objects (Beattie & Shovelton, 1999; Graham & Argyle, 1975). There is also some evidence of a trade-off between gesture and language: speakers who gesture spontaneously when describing spatial configurations omit spatial information from their language, whilst those who do not gesture retain this information in speech (Melinger & Levelt, 2004). These findings are compatible with proposals by McNeill and Duncan (2000), and by de Ruiter (2000), that gesture and language interact, and that spatial information can be packaged relatively efficiently in a gestural format.

Gesture and aphasia

Although no research to date has focused specifically on aphasia, gesture, and procedural discourse, a body of research does outline how PWA use gesture in other discourses. It is a well-established finding that PWA use more gesture in discourse than NHP (Carlomagno & Cristilli, 2006; Cicone, Wapner, Foldi, Zurif, & Gardner, 1979; Feyereisen, 1983; Hadar, Burstein, Krauss, & Soroker, 1998; Herrmann, Reichle, Lucius-Hoene, Wallesche, & Johannsen-Horbache, 1988; Lanyon & Rose, 2009; Orgassa, 2005). However, frequency data alone are not clinically informative, as they do not tell us how a speaker uses gesture, nor how it contributes to their discourse. A speaker’s language profile is likely to affect their gesture, with speakers with less impaired language producing gesture with more content. Mol, Krahmer, and van de Sandt-Koenderman (2013) found that of 26 speakers with aphasia, those with mild aphasia produced co-speech gestures that were more comprehensible than those with severe aphasia. This indicates that gesture mirrors language, rather than compensating for it. In contrast, other studies show that speakers with aphasia produce gestures that are semantically similar to those of NHP (Carlomagno & Cristilli, 2006) and gestures that differ in content from speech (Dipper et al., 2011; Hogrefe, Ziegler, Wiesmayer, Weidinger, & Goldenberg, 2013), suggesting that speech and gesture can depict different information. However, although gesture and language can diverge, there is evidence that they are not completely separate, and that speakers’ language profiles affect the form and content of gesture.
Hogrefe, Ziegler, Weidinger, and Goldenberg (2012) found that speakers with severe aphasia and good semantics produced a wider range of handshapes, and Cocks, Dipper, Pritchard, and Morgan (2013) found that speakers with aphasia and good non-verbal semantic knowledge produced gestures containing a high proportion of semantically rich manner information, describing how something moves. Finally, PWA with good semantics also produce additional gestures, linked to the lexical target, when experiencing word finding difficulty for objects (Cocks et al., 2011, 2013; Pritchard et al., 2013). These studies indicate that the ability to convey semantic information in gesture relies upon underlying retained semantic skills. Even if speakers’ “semantically dense” gestures are not produced with purposely communicative intent, as suggested by some models (Hadar et al., 1998), they are likely to serve a pragmatic function, allowing a listener to gather additional meaning when a speaker’s language is impaired. This means that meaningful gestures have an important role in contextual communication, regardless of the purpose they serve for the speaker: a gesture being intended communicatively and being understood communicatively are two separate entities (de Ruiter, 2000). Due to the clear potential for a listener gathering meaning from both speech and gesture, there is likely to be considerable value in considering the relative roles of speech and gesture in clinical language assessment, in order that speakers’ contextual communication skills are acknowledged.

The current study

Although procedural discourse is important in daily interaction, it is not yet known how PWA use gesture and language in such discourses. The current study aimed to assess how speakers present meaning in speech and gesture, by comparing the overall patterns of spontaneous iconic gesture and language used by PWA and NHP in two procedural discourses, and in two events within the discourses. The hypotheses were:
Hypothesis 1: PWA would produce syntactically and semantically impoverished language in comparison to NHP, in line with the findings of previous research (Berndt et al., 1997; Cruice et al., 2014; Ulatowska et al., 1981, 1983; Webster et al., 2007).
Hypothesis 2: PWA would produce more gestures than NHP (Carlomagno & Cristilli, 2006; Cicone et al., 1979; Feyereisen, 1983; Hadar et al., 1998; Herrmann et al., 1988; Lanyon & Rose, 2009; Orgassa, 2005).
Hypothesis 3: PWA gestures would contain information omitted from the concurrent speech, in line with theoretical models and previous research (de Ruiter, 2000; Dipper et al., 2011; McNeill & Duncan, 2000; Melinger & Levelt, 2004).

Method

Participants

Twenty-nine PWA (12 female, 17 male) and 29 NHP (18 female, 11 male) took part in the study, recruited for a larger UK study on gesture production and aphasia (Cocks et al., 2013). Participants lived in London and the South East of the UK. All participants completed the Action Research Arm Test (Lyle, 1981), which tested strength and range of movement in the upper limbs. Five PWA scored 0/57 for the right upper limb and two scored 0/57 for the left upper limb, indicating complete paralysis of this limb. One PWA scored 12/57 for the right upper limb and one scored 3/57 for the right upper limb, indicating limited use. Those participants who had limited use or complete paralysis of one upper limb had full use of the other upper limb. No participant with aphasia obtained scores indicating limb apraxia on the Birmingham University Praxis Screen (Bickerton et al., 2006) or Screen for Limb Apraxia (Poeck, 1986). Those participants who had limb paralysis completed this assessment with their functional limb. These participants also used their functional limb to gesture, with no differences in frequency or form from those participants who had functional use of both limbs. PWA were between 16 months and 32 years post-stroke, were recruited via community stroke groups, and presented with a range of mild-to-moderate aphasia profiles as defined by the Western Aphasia Battery-Revised (WAB-R) (Kertesz, 2006). The majority of speakers presented with anomic aphasia (n = 16), followed by conduction aphasia (n = 6), Broca’s aphasia (n = 4), and Wernicke’s aphasia (n = 3) (see Table 1). No participants self-reported any difficulties with speech such as verbal apraxia, nor was this noted during participation. All participants spoke English as their first language and were excluded if they had any coexisting neurological conditions or were unable to give informed consent to participate. Participants’ average age was 60.9 years (SD = 14.85).
Seven participants had completed tertiary level education, 21 had completed secondary school level education, and one had completed only junior level education. Please refer to Table 1 for a summary of participants’ WAB scores and to Cocks et al. (2013) for a more detailed description of participants’ language profiles.
Table 1.

Summary of participants’ WAB scores.

Aphasia subtype   N    AQ mean   AQ range    AQ SD
Anomic            16   83.7      71.2–89.7   4.3
Conduction        6    62.9      40.1–80     12.4
Wernicke's        3    63.3      55.7–71.5   4.3
Broca's           4    57.7      40.1–69.6   9.9
NHP were matched for age (mean = 59.69, SD = 13.63). No participant presented with any limb weakness or apraxia. NHP did not have a history of psychiatric disorder, neurological illness or insult, nor any other serious medical condition. All NHP were right-handed and all spoke English as a first language. Ethical approval for the study was obtained from City University Ethics Committee.

Procedure

Participants were invited to take part in “The describing events project”, focusing upon the impact of aphasia on discourse production. Data were collected at a university clinic, a day centre, or participants’ own homes. Speakers produced discourses in response to the questions “can you tell me how you would wrap a box in paper for a present?” (gift wrapping procedure) and “can you tell me how you would change the wheel of a car?” (wheel changing procedure). After the question was posed, participants were given positive but neutral encouragement, such as nodding and smiling. Participants were considered to have finished when they stopped talking for >10 s or gave some indication of being finished, for example, saying “and that’s it”. Participants were filmed on a digital video camera, placed approximately 1 m away from them, and positioned to obtain a front view.

Analysis

Discourse analysis

Discourse analysis aimed to address Hypotheses 1 and 2, relating to overall patterns of gesture and language use. All discourses were transcribed verbatim using MS Office Word and Windows Media Player by the first author, a qualified Speech and Language Therapist (SLT) with post-qualification experience of working with adults with aphasia. See Appendix for example transcripts. Language was coded by hand in MS Office Word, collated in MS Office Excel, and analysed for number of words, PAS (Webster et al., 2007), and frequency of spatial motor terms (SMTs; Hostetter et al., 2007). The number of words, rather than any other measure such as mean length of utterance, was used so that findings were comparable with previous gesture frequency studies (e.g., Cocks et al., 2013; Sekine, Rose, Foster, Attard, & Lanyon, 2013). The PAS analysis tallied the number of internal arguments used with each main verb. Examples of 0, 1, and 2 argument structures are given in Table 2. From these data, the average PAS score was calculated using the formula (total number of arguments produced/total number of predicates produced). The resulting figure describes the average complexity of the structures produced, termed “mean PAS complexity” by Webster et al. (2007). SMTs described how and where things moved and included actions (e.g., turn), descriptors (e.g., tightly and opposite), and positional language (e.g., under).
Table 2.

Examples of 0, 1, and 2 argument structures.

Number of arguments   Examples
0                     “and then twist”
1                     “fold [the paper]”; “sellotape [that]”
2                     “you put [the box] [down]”; “drop [the jack] [down]”
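The mean PAS complexity calculation described above can be sketched in Python. This is a minimal arithmetic illustration, not the authors' tooling; the function name and list-of-counts input format are assumptions for the example.

```python
def mean_pas_complexity(arguments_per_predicate):
    """Mean PAS complexity (Webster et al., 2007): total number of internal
    arguments produced / total number of predicates produced. Each list entry
    is the argument count tallied for one main verb."""
    if not arguments_per_predicate:
        return 0.0  # no predicates produced
    return sum(arguments_per_predicate) / len(arguments_per_predicate)

# Three predicates with 0, 1, and 2 internal arguments, as in Table 2:
# "and then twist" (0), "fold [the paper]" (1), "you put [the box] [down]" (2).
print(mean_pas_complexity([0, 1, 2]))  # -> 1.0
```

A speaker producing mostly 2 argument structures would score closer to 2, so the measure summarises syntactic complexity across a whole discourse in a single figure.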
Iconic gestures (as defined by McNeill, 2000), occurring concurrently with language or within a word searching episode, were identified with the sound on, as this definition encompasses gestures which have a formal relation to the semantic content of the concurrent language. These were tagged using the programme EUDICO Linguistic Annotator (Wittenburg, Brugman, Russel, Klassmann, & Sloetjes, 2006). Gestures contained at least a stroke phase (Kendon, 2004) and were coded as depicting semantic features outlined in previous studies (Allen et al., 2007; Cocks et al., 2013; Kita & Özyürek, 2003; Pritchard et al., 2013), described in Cocks et al. (2011):
Path: gesture describes the direction in which something moves; for example, the thumb moves to the right alongside the speech “you put the sellotape on”.
Manner: gesture describes some aspect of the way in which the action is completed; for example, the hand makes a holding shape with the palm outward and then twists 90° to the left alongside the speech “you need to do it up”.
Attribute: gesture describes some feature of the shape or size of the item; for example, finger and thumb form a pincer grip approximately 5 cm apart to demonstrate the size of the item alongside the speech “take the wheel nut”.
Shape outline: hands trace or mould the shape of the object; for example, both hands tracing the outline of a circle whilst saying “wheel”.
Other: gesture is clearly iconic, but its relationship to co-speech is unclear.
Gestures that contained conflated information (e.g., manner and path information) were counted once per feature: for example, once for path and once for manner.

Key event analysis

The key event analysis addressed Hypothesis 3, exploring how speakers used language and gesture to describe two key events within the discourses. The key events were selected because they are essential steps in the discourses, had been identified as such in previous work (Hostetter et al., 2007), and were present in the discourses of most speakers. In the gift wrapping procedure, the key event was the point where the paper is first folded over the box (fold event), and in the wheel changing procedure, the key event was the point where the new wheel is put onto the car (put event). Speakers who did not describe the event were omitted from this analysis only. The main verbs were coded for semantic weight, with the verbs come, go, make, take, get, give, do, have, be, and put coded as semantically light, and all others coded as semantically heavy (Berndt et al., 1997; Hostetter et al., 2007). A gesture was counted as synchronised if it occurred anywhere within the clause describing the fold or put event, for example, alongside the verb itself or alongside the preposition in an argument phrase. This is because the majority of iconic gestures occur alongside verbs in healthy speakers (Hadar & Krauss, 1999), and the semantics of the verb has been shown to directly influence the shape of a gesture (the verb “swing” in Kita & Özyürek, 2003). There is no corresponding heavy/light distinction in gesture categorisation, but gestures are commonly analysed for the same semantic components used in language semantics, for example, path and manner (Kita & Özyürek, 2003). In spoken language, “light” verbs are highly frequent and convey only basic information about events or states, such as the fact that something moved. The gestures most similar to this, in terms of both frequency and the paucity of semantics conveyed, are path gestures.
Therefore, in this “gesture weight” analysis, “path” gestures were considered light, as the hand movement depicts minimal semantic information describing the path of movement only. See Table 3 for examples of light and heavy language and gesture.
Table 3.

Examples of light and heavy language and gesture in the “fold” event.

Light
  Gesture: participant moves hand to the left 30 cm (describing path of movement only)
  Language: “You do the paper over the box”
Heavy
  Gesture: participant’s handshape depicts holding a piece of paper and folding it over the box (describing path, manner, and attribute)
  Language: “You fold the paper over the box”
Match
  Gesture: participant’s handshape depicts holding a piece of paper and folding it over the box (describing path, manner, and attribute)
  Language: “You fold the paper over the box”
Mismatch
  Gesture: participant moves hand to the left 30 cm (describing path of movement only)
  Language: “You fold the paper over the box”
Finally, the gesture and the language were considered together, to determine whether one modality carried more semantic weight. For example, if a speaker said “you do the paper” whilst producing a gesture depicting holding the paper and folding it over, the language and gesture were coded as mismatched, as the semantics of the gesture indicated the manner of movement and attributes of the object being moved, whilst the lexical semantics of the verb do did not.
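The heavy/light coding and the match/mismatch comparison described above can be sketched as follows. This is a simplified illustration: the light-verb list is from Berndt et al. (1997) as cited above, but the function names and data format are hypothetical.

```python
# Light verbs per Berndt et al. (1997); all other main verbs are coded heavy.
LIGHT_VERBS = {"come", "go", "make", "take", "get", "give", "do", "have", "be", "put"}

def verb_weight(verb):
    return "light" if verb.lower() in LIGHT_VERBS else "heavy"

def gesture_weight(features):
    # A gesture depicting path alone is light; any manner, attribute, or
    # shape-outline content makes it semantically heavy.
    return "light" if set(features) <= {"path"} else "heavy"

def modalities_match(verb, gesture_features):
    """True when speech and gesture carry the same semantic weight."""
    return verb_weight(verb) == gesture_weight(gesture_features)

# Mismatch example from Table 3: light verb "do" with a heavy gesture
# depicting holding the paper and folding it over the box.
print(modalities_match("do", ["path", "manner", "attribute"]))  # -> False
```

Under this scheme, the mismatches reported in the Results (a heavy gesture with a light verb) are exactly the cases where the gesture carries semantic content the verb does not.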

Statistics

Data were analysed using IBM SPSS Statistics 21 software and compared using 2-tailed t-tests, with significance set at p < 0.05. Frequency of gestures, number of words per gesture, and number of internal arguments used by each group were compared using a mixed analysis of variance, with group as the between-subjects variable and number of arguments as the within-subjects variable. Post hoc comparisons were completed using 2-tailed tests, with significance set at p < 0.005 to adjust for multiple comparisons. Key event comparisons between groups were completed using Fisher’s exact tests, p < 0.05.
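The key event comparisons are Fisher's exact tests on 2 × 2 count tables. A standard-library sketch of the two-sided test is shown below; this is an illustration rather than the SPSS routine the authors used, and the example counts are the fold event heavy-verb frequencies reported in the Results (NHP 26/28 vs PWA 13/21).

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact test for the 2x2 table [[a, b], [c, d]]:
    sum the hypergeometric probabilities of all tables with the same margins
    that are as likely as or less likely than the observed table."""
    n = a + b + c + d
    row1, col1 = a + b, a + c

    def p_of(k):  # probability of k in the top-left cell given fixed margins
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    p_obs = p_of(a)
    k_min = max(0, row1 - (n - col1))
    k_max = min(row1, col1)
    return sum(p_of(k) for k in range(k_min, k_max + 1) if p_of(k) <= p_obs + 1e-12)

# Fold event, heavy verb use: NHP 26 heavy / 2 light, PWA 13 heavy / 8 light.
p = fisher_exact_two_sided(26, 2, 13, 8)
print(p < 0.05)  # -> True, consistent with the reported group difference
```

Fisher's exact test is appropriate here because several cells have small expected counts, where a chi-square approximation would be unreliable.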

Reliability

Coding was completed by an English-speaking SLT with post-qualification experience with adults with aphasia. A second rater, also a qualified English-speaking SLT with post-qualification experience with adults with aphasia, coded 10% of the data for language measures (word count, PAS, SMTs, and verb semantic weight) and gestures (frequency and form). Interrater reliability is reported in Table 4. Differences in coding were resolved through discussion.
Table 4.

Interrater reliability analyses.

Analysis               Reliability level (%)
Transcription          94
PAS                    92.8
SMTs                   89
Verb semantic weight   100
Gesture coding         84.7

Results

The following section will present speakers’ overall use of gesture and language in the discourses and then the findings for the key events within each discourse in gesture, language, and relative semantic weight.

Procedural discourse analysis

Overall language

PWA produced significantly shorter discourses, using fewer words to describe both procedures (wheel changing procedure, t(51) = 1.95, p < 0.05; gift wrapping procedure, t(53) = 2.93, p < 0.05). Refer to Tables 5 and 6 for figures.
Table 5.

Means of language and gesture used in the procedural discourses by PWA.

                     Mean    SD
Wheel changing procedure
  # words            90.03   61.26
  # verbs            9.36    7.23
  PAS complexity     1.19    0.57
  # SMTs             16.74   11.29
  % SMTs             19.3    8.37
  # gestures         7.61    5.95
  Words per gesture  16.8    21.05
Gift wrapping procedure
  # words            69.34   41.97
  # verbs            7.1     5.62
  PAS complexity     1.14    0.45
  # SMTs             14.33   9.98
  % SMTs             19.56   10.25
  # gestures         7.61    6.96
  Words per gesture  19.33   21.22
Table 6.

Means of language and gesture used in the procedural discourses by NHP.

                     Mean     SD
Wheel changing procedure
  # words            119.22   75.57
  # verbs            15.55    10.35
  PAS complexity     1.45     0.35
  # SMTs             29.31    17.92
  % SMTs             23.56    3.31
  # gestures         6.55     4.7
  Words per gesture  14.04    10.34
Gift wrapping procedure
  # words            100.09   66.96
  # verbs            13.96    7.31
  PAS complexity     1.42     0.35
  # SMTs             15.86    14.73
  % SMTs             24.32    7.14
  # gestures         6.32     4.44
  Words per gesture  21.43    16.98

PAS

As predicted, PWA used fewer verbs in both the wheel changing procedure, t(54) = 3.59, p < 0.05, and the gift wrapping procedure, t(51) = 3.48, p < 0.05. This was reflected in lower PAS scores for both the wheel changing procedure, t(55) = 1.89, p < 0.05, and the gift wrapping procedure, t(53) = 3.34, p < 0.05. Refer to Tables 5 and 6 for figures. Speakers in both groups used 0, 1, and 2 argument PASs. Each group behaved consistently in the two discourse tasks: NHP used 2 argument structures most frequently, followed by 1 argument and 0 argument structures, whilst PWA used 1 argument structures most frequently, followed by 2 argument and 0 argument structures (see Figures 1 and 2). There was a significant interaction between group and number of arguments used for the wheel changing procedure, F(2) = 2.36, p < 0.05, and the gift wrapping procedure, F(2) = 2.26, p < 0.05, indicating that the two groups had different levels of syntactic complexity. Post hoc comparisons indicated a significant difference in the use of 2 argument structures between groups for both discourses, with NHP using a higher percentage of these structures: wheel changing procedure (PWA, m = 38.18%, SD = 27.45; NHP, m = 49.17%, SD = 22.16), t(55) = 2.34, p < 0.005; gift wrapping procedure (PWA, m = 31.08%, SD = 25.34; NHP, m = 44.55%, SD = 16.55), t(54) = 3.59, p < 0.005.
Figure 1.

Gift wrapping procedure: PAS.

Figure 2.

Wheel changing procedure: PAS.


SMTs

As predicted, PWA used fewer SMTs overall than NHP in both procedures: wheel changing procedure, t(51) = 3.07, p < 0.05, and gift wrapping procedure, t(53) = 3.38, p < 0.05. They also used fewer SMTs as a percentage of overall words used: wheel changing procedure, t(51) = 2.2, p < 0.05, and gift wrapping procedure, t(56) = 2.2, p < 0.05. Refer to Tables 5 and 6 for figures.

Overall gesture frequency and form

The frequency and form of participants’ gestures were analysed to test Hypothesis 2 that PWA would produce more gestures than NHP and that these gestures would differ in form from those produced by NHP. Contrary to expectations, there was no effect of group or discourse on the frequency of gestures, F(2, 58) = 0.29, p > 0.05. The frequency of words per gesture was similar across both groups for both discourses, F(2, 58) = 0.663, p > 0.05. Also contrary to expectations, across both tasks, both groups produced gestures containing similar semantic information, indicated by no interaction between group and semantic content of gesture, wheel changing procedure, F(4, 2.717) = 0.775, p > 0.05, and gift wrapping procedure F(4, 2.712) = 1.489, p > 0.05. This is depicted in Figures 3 and 4.
Figure 3.

Gift wrapping procedure: Percentage of gestures containing specific semantic features of form.

Figure 4.

Wheel changing procedure: Percentage of gestures containing specific semantic features of form.

Key events were analysed in order to test Hypothesis 3, that PWA gesture would differ from their language. Patterns of language and gesture use were different in the two events.

Gift wrapping procedure: “Fold” event

Twenty-eight NHP and 21 PWA used gesture and language to describe the fold event within the gift wrapping procedure. The NHP group used a higher frequency of heavy verbs than PWA (NHP = 26/28; PWA = 13/21, Fisher’s exact p < 0.05). Both groups used the verbs “fold” and “wrap” most frequently to describe this event (see Table 7).
Table 7.

Gift wrapping procedure Fold event verbs used by speakers.

Group          Verb used
NHP (n = 28)   fold (16), wrap (5), bring (3), pull (1), put (1), place (1), lift (1)
PWA (n = 21)   fold (4), wrap (4), put (4), no verb used (3), negotiate (1), get (1), is (1), wind (1), do (1), stick (1)
Most participants used semantically heavy gestures to describe this event (NHP = 28/28; PWA = 20/21, Fisher’s exact, p < 0.05). As shown in Figure 5, the majority of these contained aspects of both path and manner.
Figure 5.

Gift wrapping procedure Fold event: Percentage of participants’ gestures containing features of path, attribute, and manner.

There was a difference between groups in whether the information conveyed in speech and gesture matched (Fisher’s exact, p < 0.05). NHP communicated similar information in both modalities: 25/26 of participants used a semantically heavy verb with a semantically rich gesture, for example, saying “you fold it over” [heavy verb], alongside a gesture that demonstrated the size of the paper and the way it would be held whilst being folded [heavy gesture]. In line with expectations, some speakers with aphasia presented different information in speech and gesture. Just 11/21 of the gestures and language produced by the PWA matched, with all mismatched speech and gesture combinations being a heavy gesture used with a light verb. See Table 3 for an example of this.

Wheel changing procedure: “Put” event

Nineteen NHP and 17 PWA described the put event using gesture and language. Both groups used “put” most frequently to describe this event (see Table 8).
Table 8.

Wheel changing procedure Put event verbs used by speakers.

Group: Verbs used
NHP (n = 19): put (10), do (2), change (2), place (2), stick (1), bring (1), slot (1)
PWA (n = 17): put (12), do (1), change (1), figure (1), replace (1), fix (1)
As shown in Figure 6, the majority of gestures produced by all participants were “heavy” gestures (NHP = 17/19; PWA = 17/17, Fisher’s exact, p > 0.05), and the majority of these contained aspects of both path and manner.
Figure 6.

Wheel changing procedure Put event: Percentage of participants’ gestures containing features of path, attribute, and manner.

Both groups used similar patterns of language and gesture, with the content of language and gesture frequently differing (NHP mismatched = 12/19; PWA mismatched = 15/17, Fisher’s exact, p > 0.05). These mismatches were all semantically heavy gestures with semantically light verbs. In summary, PWA used language that was syntactically and semantically simpler than NHP in their procedural discourses, whilst producing gestures that conveyed a similar amount of semantic information to NHP. In the key event analysis, both groups presented information similarly in the light put event, but differently in the fold event, where PWA used gestures that were semantically heavy alongside semantically light language.
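The group comparisons above rest on Fisher’s exact tests over small 2 × 2 tables (e.g., heavy-verb use in the fold event: NHP 26/28 vs. PWA 13/21). As an illustrative sketch only, not the authors’ analysis code, a two-sided Fisher’s exact test can be computed from the hypergeometric distribution using the Python standard library:

```python
from math import comb

def fisher_exact_two_sided(a, b, c, d):
    """Two-sided Fisher's exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums the hypergeometric probabilities of all tables with the same
    margins whose point probability does not exceed that of the
    observed table (the standard two-sided convention).
    """
    r1, r2 = a + b, c + d       # row totals (e.g., group sizes)
    c1 = a + c                  # first column total (e.g., heavy-verb users)
    n = r1 + r2
    denom = comb(n, c1)

    def p_table(x):
        # probability of a table with x "successes" in row 1
        return comb(r1, x) * comb(r2, c1 - x) / denom

    p_obs = p_table(a)
    lo, hi = max(0, c1 - r2), min(r1, c1)
    # small tolerance guards against floating-point ties
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Fold event, heavy verbs: NHP 26/28 vs. PWA 13/21 -> p ~ 0.012 (< 0.05)
# Put event, heavy gestures: NHP 17/19 vs. PWA 17/17 -> p > 0.05
```

Applied to the counts reported above, this reproduces the pattern of results: the fold-event heavy-verb difference is significant, while the put-event heavy-gesture comparison is not.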

Discussion

This study investigated how PWA presented information using gesture and language in procedural discourses, compared with NHP. The study aimed to identify the information presented in gesture in the context of impaired language, both overall and in individual events. This was completed with the dual purpose of informing clinical assessment of communication and understanding the multimodal nature of language.

Ours is the first study that has focused on the relative roles played by gesture and language in the procedural discourses of PWA. As predicted, and in line with previous research, PWA used syntactically less complex language in the discourses. Contrary to our expectations and the findings of previous studies, overall, PWA used gestures with semantically similar forms to NHP. However, there were differences in the content of gesture and language when describing the key events, for both NHP and PWA. Both groups produced gesture and language that did not match, such as a semantically heavy gesture alongside a semantically light verb. These findings are likely to be due to speakers’ lexical choices and the spatial nature of the discourse and are informative for clinical practice. They also contribute to the debate regarding the role of gesture in language production.

As hypothesised, and consistent with previous studies (Cruice et al., 2014; Johnson et al., 2013; Ulatowska et al., 1981; Webster et al., 2007), PWA in the current study produced syntactically and semantically less complex language in their discourses. However, contrary to hypotheses and the majority of previous studies (Cicone et al., 1979; Cocks et al., 2013; Feyereisen, 1983; Hadar et al., 1998; Herrmann et al., 1988; Lanyon & Rose, 2009; Orgassa, 2005; Sekine et al., 2013), both groups used similar quantities and types of gestures.
The fact that both groups in this study used a comparable number of gestures, but that PWA used fewer SMTs than NHP, indicates that PWA presented more spatial motor information in gesture than in language. What is striking in these findings is that PWA gesture was accurate despite their impoverished language. Speakers from both groups presented information omitted from spoken language in gesture, in line with our predictions. In the semantically heavy fold event, NHP produced the same information in speech and gesture, whilst PWA omitted information from their speech and included it within their gesture. By contrast, in the semantically lighter put event, both groups used spoken language containing limited information, alongside gestures that contained more semantic information. There is an evidence base in the literature on the semantic complexity of verbs used by PWA (Barde, Schwartz, & Boronat, 2006; Berndt et al., 1997; Faroqi-Shah & Graham, 2011; Gordon, 2008), and the verb put has been analysed as light by others (Gordon, 2008). To our knowledge, the same variable has not been explored in gesture, and although there is no corresponding heavy/light distinction in gesture categorisation, gestures are commonly analysed for the same semantic components used in language semantics, for example, path and manner (Kita & Özyürek, 2003). Path gestures display characteristics most similar to light verbs, in being highly frequent and in conveying sparse information, such as the fact that something moved. This mismatch in the information presented in gesture and speech could be explained through the syntactic and semantic properties of the verb put, which carries meaning in its arguments that may be conveyed in gesture.
However, the fact that speakers with aphasia also produced the same pattern of light language and heavy gesture in the heavy fold event is likely to indicate that when speech contains limited information, spontaneous co-speech gesture can carry some of the communicative load. This dissociation between language and gesture is consistent with theoretical models (de Ruiter, 2000; McNeill & Duncan, 2000) and previous studies on spatial communication in NHP and in speakers with aphasia (Dipper et al., 2011; Graham & Argyle, 1975; Melinger & Levelt, 2004). In all communication, listeners gather meaning not only from speech but also from other modalities, so gesture is a key element of procedural discourses, given their spatial content. The dissociation between speech and gesture demonstrated within the current study indicates that it can be a normal pattern to communicate more information in gesture than in language. Future research should focus on the relationship between specific aspects of language semantics and gestures in specific events and different discourse genres.

These findings are of importance clinically. First, there should be different expectations of speech and gesture depending on genre, due to differences in lexical choices and semantics. Second, if a speaker with aphasia is not producing heavy gestures in contexts where they would be expected, such as in procedures, this is likely to indicate that their semantic system is impaired. Finally, in terms of clinical assessment and intervention, both modalities need to be taken into account. Whilst a speaker may not intentionally produce gesture to directly compensate for impaired language, the use of semantically rich gesture alongside semantically impoverished language is likely to be useful to the listener and may well contribute to the observation that some speakers with aphasia are able to communicate better than they talk (Holland, 1982).
This finding is likely to be specific to speakers’ language profile: PWA in this study had mild-to-moderate aphasia and produced gesture concurrent with language, not in its place. Future research should explore the impact of language profile, such as aphasia type, fluency, and severity, on language and gestures in discourses of different functions and levels of complexity. The current study confirmed previous findings that the language of speakers with aphasia is impaired in form and content in procedural discourses. It also presented new evidence that the gestures used by PWA in this discourse genre carry rich semantic information and should be considered in clinical assessment.

Funding

This work was supported by a City University London Pump Priming Grant and the Dunhill Medical Trust [grant reference: R171/0710].
References

1. Barde LHF, Schwartz MF, Boronat CB. Semantic weight and verb retrieval in aphasia. Brain Lang. 2005.
2. Herrmann M, Reichle T, Lucius-Hoene G, Wallesch CW, Johannsen-Horbach H. Nonverbal communication as a compensative strategy for severely nonfluent aphasics? A quantitative approach. Brain Lang. 1988.
3. Nicholas LE, Brookshire RH. Presence, completeness, and accuracy of main concepts in the connected speech of non-brain-damaged adults and adults with aphasia. J Speech Hear Res. 1995.
4. Nicholas LE, Brookshire RH. A system for quantifying the informativeness and efficiency of the connected speech of adults with aphasia. J Speech Hear Res. 1993.
5. Ulatowska HK, Doyel AW, Stern RF, Haynes SM, North AJ. Production of procedural discourse in aphasia. Brain Lang. 1983.
6. Lyle RC. A performance test for assessment of upper limb function in physical rehabilitation treatment and research. Int J Rehabil Res. 1981.
7. Cicone M, Wapner W, Foldi N, Zurif E, Gardner H. The relation between gesture and language in aphasic communication. Brain Lang. 1979.
8. Johnson S, Cocks N, Dipper L. Use of spatial communication in aphasia. Int J Lang Commun Disord. 2013.
9. Berndt RS, Haendiges AN, Mitchum CC, Sandson J. Verb retrieval in aphasia. 2. Relationship to sentence processing. Brain Lang. 1997.
10. Poeck K. The clinical examination for motor apraxia. Neuropsychologia. 1986.