
Communication of emotion via drumming: dual-brain imaging with functional near-infrared spectroscopy.

Rahil Rojiani1, Xian Zhang1, Adam Noah1, Joy Hirsch1,2,3,4.   

Abstract

Nonverbal communication of emotion is essential to human interaction and relevant to many clinical applications, yet it is an understudied topic in social neuroscience. Drumming is an ancient nonverbal communication modality for expression of emotion that has not been previously investigated in this context. We investigate the neural response to live, natural communication of emotion via drumming using a novel dual-brain neuroimaging paradigm. Hemodynamic signals were acquired using whole-head functional near-infrared spectroscopy (fNIRS). Dyads of 36 subjects participated in two conditions, drumming and talking, alternating between 'sending' (drumming or talking to partner) and 'receiving' (listening to partner) in response to emotionally salient images from the International Affective Picture System. Increased frequency and amplitude of drum strikes was behaviorally correlated with higher arousal and lower valence measures and neurally correlated with temporoparietal junction (TPJ) activation in the listener. Contrast comparisons of drumming greater than talking also revealed neural activity in right TPJ. Together, findings suggest that emotional content communicated by drumming engages right TPJ mechanisms in an emotionally and behaviorally sensitive fashion. Drumming may provide novel, effective clinical approaches for treating social-emotional psychopathology.

Year:  2018        PMID: 30215809      PMCID: PMC6204489          DOI: 10.1093/scan/nsy076

Source DB:  PubMed          Journal:  Soc Cogn Affect Neurosci        ISSN: 1749-5016            Impact factor:   3.436


Introduction

Research on the communication of emotion has focused primarily on the neural mechanisms of communicating via speech, despite the fact that emotion is also communicated through nonverbal modalities ranging from music to body language. As these modalities are often used instead of or in addition to speech, we hypothesize that they offer something unique or supplemental that merits investigation and may have unique clinical application. Further, the communication of emotion is a bidirectional process, which includes both sensitivity to the emotional cues of others and the expression of internal emotional states to others. Emotion research has generally focused on the unidirectional process of perception or induction (i.e. the brain’s reactivity to emotional stimuli), in large part because conventional neuroimaging modalities are limited to single subjects. This study utilized a simultaneous, dual-brain neuroimaging paradigm to study bidirectional communication of emotion, including sending and receiving emotional content. In particular, we investigated the neural correlates of communicating emotion via drumming and listening, as compared to talking and listening.

Drumming is an ancient nonverbal form of communication that has been used across the world throughout history, with some of the earliest drums dating back to 5500–2350 BCE in China (Liu, 2005). Drums have typically played a communicative role in social settings, which may point to their evolutionary origin (Randall, 2001; Remedios et al., 2009). For example, slit drums and slit gongs—used from the Amazon to Nigeria to Indonesia—often use particular tonal patterns to convey messages over distance (Stern, 1957). Similarly, the renowned ‘talking drums’ of West Africa convey information with stereotyped stock phrases that mimic the sounds and patterns of speech (Stern, 1957; Carrington, 1971; Arhine, 2009; Oluga and Babalola, 2012). These drums have been used to communicate messages over thousands of kilometers by relaying from village to village (Gleick, 2011), and they also appear in other communal settings, including dancing, rituals, story-telling and other ceremonies (Carrington, 1971; Ong, 1977), suggesting that they hold not only semantic but also emotional information. More commonly, however, drums are used in social or ceremonial settings without holding semantic information and without intention to mimic speech. Such drumming serves various functions of emotional communication, such as instilling motivation or fear (e.g. during war), synchronizing group activity (e.g. agricultural work or marching) or building social cohesion (recreational and ceremonial drum circles). For example, Wolf (2000) explores how drums are used among South Asian Shi’i Muslims in the mourning process that commemorates the killing of a political and spiritual leader at the Battle of Karbala in 680 CE. He points to how various qualities of drumming may facilitate the listener’s emotional relationship with the event (e.g. slow tempo for sadness, loud drum strikes for the intensity of grief). The use of drums in situations where verbal communication is also available suggests that drumming adds value to verbal communication, possibly deepening the emotional experience. In this study, we aim to understand the putative neural mechanisms that underlie bidirectional communication of emotion via drums.

Functional near-infrared spectroscopy (fNIRS) uses signals based on hemodynamic responses, similar to functional magnetic resonance imaging (fMRI), but with distinct features that are beneficial for the study of communication. For example, the signal detectors are head-mounted (i.e. non-invasive caps), participants can be seated directly across from each other while being simultaneously recorded, the system is virtually silent and data acquisition is tolerant of limited head motion (Eggebrecht et al., 2014). These features facilitate a more ecologically valid neuroscience of dual communicating brains. The fNIRS hemodynamic signals are based on differential absorption of light at wavelengths sensitive to concentrations of oxygenated and deoxygenated hemoglobin (deoxyHb) in the blood. As with fMRI, observed variation in blood oxygenation serves as a proxy for neural activity (Kato, 2004; Ferrari and Quaresima, 2012; Scholkmann et al., 2013; Gagnon et al., 2014). Although it is well documented that blood oxygen level–dependent (BOLD) signals acquired by fMRI and hemodynamic signals acquired by fNIRS are highly correlated (Strangman et al., 2002; Sato et al., 2013), the acquired signals are not identical. The fNIRS system acquires both oxygenated hemoglobin and deoxyHb signals; the deoxyHb signal most closely resembles the fMRI signal (Sato et al., 2013; Zhang et al., 2016) and is thus reported here. The fNIRS signals originate from larger volumes than fMRI signals, limiting spatial resolution (Eggebrecht et al., 2012). However, the fNIRS signal is acquired at a higher temporal resolution than the fMRI signal (20–30 ms vs 1.0–1.5 s), which benefits dynamic studies of functional connectivity (Cui et al., 2011). Due to the mechanics of emitting infrared light, the sensitivity of fNIRS is limited to cortical structures within 2–3 cm of the skull surface. Although fNIRS is well established (particularly in child research, where fMRI approaches remain difficult or contraindicated), recent adaptations of fNIRS for hyperscanning enable significant advances in the neuroimaging of neural events that underlie interactive and social functions in adults (Funane et al., 2011; Dommer et al., 2012; Holper et al., 2012; Cheng et al., 2015; Jiang et al., 2015; Osaka et al., 2015; Vanutelli et al., 2015; Hirsch et al., 2017; Piva et al., 2017).

Study overview

We investigated the neural response to the communication of emotional content of visual stimuli via drumming in a face-to-face paradigm, isolating the contribution of drumming to auditory emotion communication. We chose drumming as the communication modality given its accessibility to a first-time user, thus enhancing the potential for clinical application. In our experimental paradigm, pairs of subjects (dyads) were presented with images from the International Affective Picture System (IAPS; Lang et al., 2008), which served as the topic for communication via either drumming or talking. We identified the relationship between drumming behavior and the arousal or valence of the IAPS images. We then evaluated the neural sensitivity to drum behavior in both drummer and listener in order to characterize the communication of emotion. Finally, we compared the neural correlates of drumming and talking to evaluate the possible unique neural contribution of drumming over talking as a communication modality.

The IAPS images employ a two-dimensional emotion framework that distinguishes the emotional qualities of arousal and valence, as most classically represented in Russell’s circumplex model (1980). Arousal is a measure of the intensity or activating capacity of an emotion, ranging from low (calm) to high (excited); low-arousal emotions may include sadness or contentedness, while high-arousal emotions may include excitement or anger. Valence is a measure of the pleasure quality of the emotion, ranging from negative (unpleasant) to positive (pleasant); negative-valence emotions may include sadness or anger, while positive-valence emotions may include contentedness or excitement. Each of the IAPS images is indexed by valence and arousal ratings, allowing us to regress neural activity against these emotional qualities (Lang et al., 2008). The hypothesis for this investigation was twofold. First, we hypothesized that strike-by-strike drum measures that communicate expressions of valence and arousal would elicit activity in brain systems associated with social and emotional functions, i.e. the right temporoparietal junction (TPJ). Second, we hypothesized that drumming in response to emotional stimuli would elicit greater neural activity than talking in the TPJ.

Methods

Participants

Thirty-six (36) healthy adults [18 dyads; mean age, 23.8 ± 3.2 years; 86% right-handed (Oldfield, 1971)] participated in the study. The sample size was based on power analyses of similar prior two-person experiments showing that a power of 0.8 is achieved with n = 31 (Hirsch et al., 2017). All participants provided written informed consent in accordance with guidelines approved by the Yale University Human Investigation Committee (HIC #1501015178) and were reimbursed for participation. Dyads were assigned in order of recruitment; participants were not stratified further by affiliation or dyad gender mix. Participants rated their familiarity with their partner, their general musical expertise and their drumming expertise (descriptive statistics in Table 1). To facilitate drumming as a method of communication for participants regardless of previous experience, a brief interactive video tutorial was shown to all participants to acquaint them with various ways of striking the drum using both hands.
Table 1

Demographic Information

Category               Subcategory                 Total/Avg
N                                                  36
Age                                                23.8 ± 3.2
Gender                 Male                        17
                       Female                      19
                       Other                       0
Race                   Asian/Pacific Islander      17
                       Black/African American      2
                       Latin/Hispanic              0
                       Middle East/N African       1
                       Native/Indigenous           0
                       White/European              10
                       Biracial/multiracial        7
                       Other                       2
Dyad gender mix        Male/male                   5
                       Male/female                 8
                       Female/female               5
Handedness             Right                       31
                       Left                        5
Music expertise*                                   3.14 ± 1.22
Drum expertise*                                    1.64 ± 0.93
Partner familiarity*                               1.53 ± 1.23

*Based on Likert scale responses ranging from 1 to 5 for musical expertise (1 = never played, 5 = plays professionally), drumming expertise (1 = never played, 5 = plays professionally) and partner familiarity (1 = never seen or spoken to, 5 = best friends).

Table 1 includes demographic information for subjects and dyads, as well as participant characteristics regarding musical expertise, drum expertise and familiarity with experiment partner.

Experimental paradigm

Dyads were positioned face to face across a table 140 cm from each other (Figure 1). Pseudo-randomized image stimuli presented on each trial were selected from a subset of the IAPS (Lang et al., 2008). These images were presented to both participants via a monitor on each side of the table that did not obstruct view of their partner. In each trial, one subject responded to the image stimulus by drumming or talking while the other listened.
Fig. 1

Experimental set-up for two interacting partners in the drumming condition. The talking communication condition was identical, but without the drum apparatus.

In the drumming condition, participants were encouraged to respond to the image however they felt appropriate, including a direct response to the emotional content of the image, drumming as if they were acting within the image (e.g. with punches or strokes), or drumming as if creating the soundtrack to the image. In the talking condition, participants were encouraged to speak about what they saw, their experience with the elements of the image, their opinion about the image or elements within it or whatever came to mind in response to the image. The images changed and roles alternated between ‘sending’ (drumming or speaking) and ‘receiving’ (listening to partner) every 15 s for 3 min (Figure 2). For example, as illustrated in Figure 2, Event 1, after Subject 1 had spoken about the space shuttle liftoff for 15 s while Subject 2 listened, an image of flowers (Figure 2, Event 2) replaced the space shuttle image on both subjects’ screens, cuing Subject 2 to speak about this new image while Subject 1 listened. This 3-min run of alternating ‘sending’ and ‘receiving’ every 15 s thus totals 12 epochs. This was repeated for a total of two runs of drumming and two runs of talking for each pair of subjects.

Fig. 2

Experimental paradigm. Each run is 3 min, 12 epochs (8 epochs shown here). Subjects alternate ‘sending’ (speaking or drumming) and ‘receiving’ when triggered by image change. Each image was selected from the IAPS library with established arousal and valence ratings.

For each dyad, the following conditions were pseudo-randomized: order of experiment runs (i.e. dialogue runs first or drumming runs first), order of subjects responding within runs and order of subjects responding across runs. The order of presentation of the series of 96 images for each experiment was also randomized.

Image stimuli

The stimuli used for each experiment were a set of 96 images selected from the IAPS (Lang et al., 2008). These images have established ratings for arousal (low to high) and valence (negative to positive) on a 1–9 Likert scale. Examples are given in Figure 3, and a scatterplot depicting the valence and arousal distributions of our image subset is included in Figure S1 (Supplementary Material). The library numbers of these images and their relevant statistics can be found in the appendix.
Fig. 3

Examples of IAPS images with low/high arousal (A) and negative/positive valence (V). The figure illustrates the arousal/valence index system for emotional qualities of each image.


Quantified drumming response

The electronic drums utilized for this study use a Musical Instrument Digital Interface (MIDI) protocol to record quantity and force of drum strikes. For each run, we collected strike-by-strike information, including the average force of drum strikes, the total number of drum strikes and the product of these two values (providing a combined objective quantification of drumming response). This quantified drumming response was then correlated with the established arousal and valence ratings for image stimuli, serving as a behavioral measure of responses to IAPS images. Strike-by-strike measures were taken as the ‘sending’ variable.
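As a concrete illustration, the combined measure described above (strike count multiplied by average strike force) and its correlation with IAPS ratings can be sketched in a few lines of Python. This is a minimal sketch, not the study's analysis code; the velocity and rating values below are hypothetical examples, not data from the study.

```python
from statistics import mean

def drumming_response(strikes):
    """Combined drumming measure: number of strikes x mean strike force.

    `strikes` is a list of MIDI note-on velocities (0-127) recorded
    during one 15-s 'sending' epoch; an empty epoch scores 0.
    """
    if not strikes:
        return 0.0
    return len(strikes) * mean(strikes)

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical epochs: MIDI velocities recorded on each trial, paired with
# the IAPS arousal rating (1-9 scale) of the image shown on that trial.
velocities = [[90, 95, 88, 100], [40, 35], [110, 105, 112, 98, 101], [55, 60, 50]]
arousal = [6.8, 3.1, 7.2, 4.0]

responses = [drumming_response(v) for v in velocities]
r = pearson_r(responses, arousal)  # positive: harder/more strikes with higher arousal
```

In the study itself, such per-epoch responses were correlated with the established arousal and valence ratings across all 96 images.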

Signal acquisition and processing

NIRS signal acquisition, optode localization and signal processing, including the global mean removal used here, are similar to standard methods described previously for the deoxyHb signal (Noah et al., 2015; Zhang et al., 2016; Dravida et al., 2017; Hirsch et al., 2017; Noah et al., 2017; Zhang et al., 2017). Hemodynamic signals were acquired using a 64-fiber (84-channel) continuous-wave fNIRS system (Shimadzu LABNIRS, Kyoto, Japan). The cap and optode layout of the system provided extended head coverage for both participants, achieved by distributing 42 channels (3-cm source-detector separation) over both hemispheres of each participant's scalp (Figure 4). Anatomical locations of optodes in relation to standard head landmarks were determined for each participant using a Patriot 3D Digitizer (Polhemus, Colchester, VT) (Okamoto and Dan, 2005; Singh et al., 2005; Eggebrecht et al., 2012; Ferradal et al., 2014). The Montreal Neurological Institute (MNI) coordinates (Mazziotta et al., 2001) for each channel were obtained using NIRS-SPM software (Ye et al., 2009), and the corresponding anatomical locations of each channel were determined by the provided atlas (Rorden and Brett, 2000). Table S1 (Supplementary Material) lists the median MNI coordinates and anatomical regions with probability estimates for each of the channels shown in Figure 4.
Fig. 4

Right and left hemispheres of a single-rendered brain illustrate average locations (red circles) for channel centroids. See Table S1 (Supplementary Material) for average MNI coordinates and anatomical locations.

Pre-coloring was applied via high-pass filtering; pre-whitening was not applied, a decision guided by a previous report showing a detrimental effect of pre-whitening on neural responses during a finger-thumb-tapping task (Ye et al., 2009). Baseline drift was modeled based on the time series and removed using the wavelet detrending provided in NIRS-SPM. Global components resulting from systemic effects such as blood pressure (Tachtsidis and Scholkmann, 2016) were removed using a principal component analysis spatial filter (Zhang et al., 2016) prior to general linear model (GLM) analysis. Comparisons between conditions were based on the GLM (Penny et al., 2011). Event epochs were convolved with a standard hemodynamic response function modeled to the contrast between ‘sending’ (drumming or talking) and ‘receiving’ (listening), providing individual beta values of the difference for each participant across conditions. Group results were rendered on a standardized MNI brain template.
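The alternating 15-s block design and its GLM contrast can be illustrated schematically. This is a minimal sketch under stated assumptions, not the authors' pipeline: it assumes a canonical SPM-style double-gamma hemodynamic response function, a hypothetical 0.5-s sample interval, and synthetic noise-free data, and it fits a single-regressor GLM by ordinary least squares.

```python
import math

def hrf(t):
    """Canonical double-gamma HRF: a positive gamma peaking ~5 s minus a
    later undershoot (~15 s) scaled by 1/6 (SPM-style parameters)."""
    if t < 0:
        return 0.0
    def gamma_pdf(x, shape, scale):
        return (x ** (shape - 1) * math.exp(-x / scale)) / (math.gamma(shape) * scale ** shape)
    return gamma_pdf(t, 6, 1.0) - gamma_pdf(t, 16, 1.0) / 6.0

def boxcar(n_samples, dt, epoch_s=15.0):
    """+1 during 'sending' epochs, -1 during 'receiving' epochs,
    alternating every 15 s (the contrast modeled in the GLM)."""
    return [1.0 if int((i * dt) // epoch_s) % 2 == 0 else -1.0
            for i in range(n_samples)]

def convolve(signal, kernel):
    """Discrete convolution, truncated to len(signal)."""
    return [sum(signal[i - j] * kernel[j] for j in range(min(i + 1, len(kernel))))
            for i in range(len(signal))]

def glm_beta(y, x):
    """Single-regressor GLM with intercept: ordinary least-squares beta."""
    n = len(y)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

dt = 0.5                        # assumed sample interval in seconds
n = int(180 / dt)               # one 3-min run (12 epochs of 15 s)
kernel = [hrf(i * dt) * dt for i in range(int(30 / dt))]
design = convolve(boxcar(n, dt), kernel)

# Synthetic deoxyHb time series: offset plus a known effect size of 1.5,
# so the fitted beta recovers the simulated effect.
y = [2.0 + 1.5 * d for d in design]
beta = glm_beta(y, design)
```

In practice the contrast regressor would be fitted per channel and participant, with the filtering and detrending steps above applied first.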

Results

Drumming related to arousal and valence

Pearson product–moment correlations were computed between the established arousal ratings for each image and the quantified behavioral measure of drumming response. We observed a positive correlation (r = 0.37, P < 0.001), indicating that drumming responses increased with more arousing image stimuli (Figure 5A). The same correlation was computed against the established valence ratings for each image, yielding a negative correlation (r = −0.22, P < 0.001), indicating that drumming responses decreased with more positively valenced image stimuli (Figure 5B).
Fig. 5

A, A positive correlation (r = 0.37) was observed between the quantified drumming response (number of drum strikes multiplied by average drum strike force) and the arousal ratings of IAPS image stimuli. The bars represent two brackets equally dividing our range of IAPS image stimuli arousal ratings (lowest arousal 2.63 to highest arousal 7.35), P < 0.001. B, A negative correlation (r = −0.22) was observed between the quantified drumming response (number of drum strikes multiplied by average drum strike force) and the valence ratings of IAPS image stimuli. The bars represent two brackets equally dividing our range of IAPS image stimuli valence ratings (lowest valence 2.16 to highest valence 8.34), P < 0.001.


Neural responses to drumming (sending) and listening (receiving)

In this contrast comparison of drumming (‘sending’) and listening (‘receiving’), we convolved the strike-by-strike drum intensities with the hemodynamic response function of the block (Figure 6). Contrast comparisons of listening > drumming (blue) show activity in two right-hemisphere clusters that correlates with greater amplitude and frequency of drum response: one including the supramarginal gyrus (SMG; BA 40), superior temporal gyrus (STG; BA 22) and angular gyrus (BA 39), and one including the STG (BA 22) and middle temporal gyrus (BA 21). These regions are included in the TPJ. Comparisons of drumming > listening (red), on the other hand, show activity in two clusters, one in each hemisphere, that correlates with greater amplitude and frequency of drum response. The right-hemisphere cluster has a spatial distribution including pre-motor and supplementary motor cortex (BA 6) and primary somatosensory cortex (BA 1, 2, 3); the left-hemisphere cluster includes pre-motor and supplementary motor cortex (BA 6). Together, these are labeled sensorimotor cortex (SMC).
Fig. 6

Convolving strike by strike drumming intensities with the hemodynamic response function for the drumming (‘sending’) block, the listening condition shows greater activity than the drumming condition in two loci (blue), both in the right hemisphere. The first peak voxel was located at 64, −52, 24 (T = −3.49, P < 0.00078, P < 0.05 FDR corrected), and it included SMG (BA40) 49%, STG (BA22) 35% and angular gyrus (BA39) 16%. The second peak voxel was located at 64, −46, 6 (T = −3.84, P < 0.00031, P < 0.05 FDR corrected), and it included STG (BA22) 56% and middle temporal gyrus (BA21) 40%. In contrast, the drumming (‘sending’) condition shows greater activity than the listening (‘receiving’) condition in two loci (red), one in each hemisphere. The right hemisphere peak voxel was located at 60, −16, 42 (T = 3.58, P < 0.00061, P < 0.05 FDR corrected), and it included pre-motor and supplementary motor cortex (BA6) 43% and primary somatosensory cortex (BA 1, 2, 3) 18%, 12%, 17%. The left hemisphere peak voxel was located at −50, −6, 36 (T = 3.11, P < 0.00211), and it included pre-motor and supplementary motor cortex (BA6) 100%.


Comparison of drumming and talking

Contrast comparisons of drumming > talking show both left and right hemisphere activity (Figure 7). The spatial distribution of the right hemisphere cluster included the SMG (BA 40), the STG (BA 22) and the primary somatosensory cortex (BA 2). The spatial distribution of the left hemisphere cluster included the primary somatosensory cortex (BA 2) and the SMG (BA 40).
Fig. 7

Collapsing across qualities of valence and arousal, the drumming condition shows greater activity than the talking condition in two loci, one in each hemisphere, mapped in accordance with the NIRS-SPM atlas (Mazziotta et al., 2001; Tak et al., 2016). The right hemisphere peak voxel was located at 62, −36, 28 (T = 5.32, P < 0.00001, P < 0.05 FDR corrected), and it included SMG (BA 40) 41%, STG (BA 22) 25% and primary somatosensory cortex (BA 2) 21%. The cluster in the left hemisphere had a peak voxel at −66, −30, 30 (T = 3.78, P = 0.00030, P < 0.05 FDR corrected), with a spatial distribution including primary somatosensory cortex (BA 2) 42% and SMG (BA 40) 20%.


Discussion

In this study, we aimed to understand the neural mechanisms that underlie the communication of emotional qualities through drumming, a nonverbal auditory mode of communication. Our neuroimaging system using fNIRS and natural interpersonal interaction between dyads enables the study of ecologically valid communication. We hypothesized that strike-by-strike drum measures that communicate expressions of valence and arousal would elicit activity in brain systems associated with social and emotional functions, i.e. the right TPJ. We also hypothesized that drumming in response to emotional stimuli would elicit neural activity that was distinct from or greater than talking in response to the same stimuli in the right TPJ. Using behavioral measures, we identified that increased frequency and amplitude of strike-by-strike drum behavior was positively correlated with image arousal and negatively correlated with image valence. We also found that increased frequency and amplitude of strike-by-strike drum behavior was correlated with sensorimotor activity in the ‘sender’ but TPJ activity in the ‘listener’. Taken together, these findings support the conclusion that communication of emotion via drumming engages the right TPJ and that drumming may communicate both arousal and valence with some preference for arousal. Finally, we observed a greater cortical response in the drumming condition than in the talking condition at the right TPJ, including STG and SMG, suggesting that drumming not only activates this social–emotional brain region, but may have a distinct advantage in activating this area over talking.

Communicating arousal and valence

Specific features of drumming may explain its capacity to communicate arousal and valence, with some preference for arousal. In speech, various prosodic features are known to cue emotion, including loudness, rate, rate variability, pitch contours and pitch variability (Banse and Scherer, 1996; Koike et al., 1998; Juslin and Laukka, 2003; Ilie and Thompson, 2006). Similarly, such features in music include tempo, mode, melodic range, articulation, loudness and pitch (Gabrielsson and Lindström, 2001; Juslin and Laukka, 2003; Kim and André, 2008; Eerola et al., 2013). Prior studies across both speech and music suggest that cues like articulation, loudness, tempo and rhythm tend to influence arousal, while mode, pitch, harmony and melodic complexity influence valence (Husain et al., 2002; Ilie and Thompson, 2006; Kim and André, 2008; Gabrielsson and Lindström, 2010). Drumming has limited pitch or melodic capacity; on the other hand, cues like tempo, loudness and articulation are easily enacted through drumming, and these have been shown to allow a listener to reliably identify particular emotions via drumming (Laukka and Gabrielsson, 2000).

The importance of the right TPJ

The right TPJ, including the STG and SMG, is well established in its social and emotional function (Carter and Huettel, 2013). In a recent example using dual-brain fNIRS, the right TPJ was directly implicated in functional connectivity during human-to-human vs human-to-computer competitive interaction (Piva et al., 2017), consistent with dedicated human social function (Hirsch et al., 2018). The superior temporal sulcus and gyrus were among the earliest hypothesized nodes in the social network (Brothers, 1990), and this was substantiated by later research (Allison et al., 2000; Frith, 2007; Pelphrey and Carter, 2008). For example, this region appears to play a role in interpreting biological motion to attribute intentions and goals to others (Allison et al., 2000; Adolphs, 2003), consistent with the Theory of Mind model of the TPJ. The social role of the STG has been further investigated within the context of emotion (Narumoto et al., 2001). Robins et al. (2009) observed increased right STG (rSTG) activation with emotional stimuli; this was especially pronounced for combined audio-visual stimuli as opposed to either audio or visual stimuli alone, highlighting the importance of the rSTG in processing emotional information in live, natural social interaction. For specifically auditory stimuli, Leitman et al. (2010) observed greater activity in the posterior STG with increased saliency of emotion-specific acoustic cues in speech, and Plichta et al. (2011) observed auditory cortex activation (within STG) that was modulated by extremes of valence in emotionally salient soundbites. Still more relevant to our investigation, the emotional processing of pleasant and unpleasant music has been lateralized and localized to the rSTG (Zatorre, 1988; Zatorre et al., 1992; Blood et al., 1999). 
Although drumming does not have the same range of affective cues as other music, our investigation replicates the known sensitivity of the rSTG to emotion in music using a smaller set of cues, such as tempo, loudness and rhythmic characteristics. The SMG, the other region of the TPJ that plays a significant role in our findings, has also been implicated in social and emotional processing. Activity in the SMG has been associated with empathy and understanding the emotions held by others, suggesting a process of internal qualitative representation that facilitates empathy (Lawrence et al., 2006). Further, SMG activity increases, particularly on the right side, when one’s own mental state differs from the mental state of another person with whom one is empathizing (Silani et al., 2013).

The TPJ is relevant from a clinical perspective as well. In particular, the STG has been increasingly studied in patients on the autism spectrum, given deficits in both language and social interaction. Decreased capacity to attribute mental states to animated objects in autism spectrum disorder has been linked to decreased activation of mentalizing networks, including the STG (Castelli et al., 2002). Many other autism studies have shown abnormalities in the rSTG, both functional (Boddaert and Zilbovicius, 2002) and anatomical (Zilbovicius et al., 1995; Casanova et al., 2002; Jou et al., 2010). Volume loss in the rSTG has been noted in individuals with criminal psychopathy (Müller et al., 2008), perhaps underlying their abnormal emotional responsiveness. Volume increases in the rSTG, on the other hand, have been demonstrated in pediatric generalized anxiety disorder (De Bellis et al., 2002), in subjects exposed to parental verbal abuse (Tomoda et al., 2011) and in maltreated children and adolescents with post-traumatic stress disorder (PTSD; De Bellis et al., 2002).

Clinical application of drumming: future directions

Music and music therapy have been used in a number of clinical contexts, particularly for emotional and behavioral disorders such as schizophrenia (Talwar et al., 2006; Peng et al., 2010), depression (Maratos et al., 2008; Erkkilä et al., 2011) and substance use disorders (Cevasco et al., 2005; Baker et al., 2007). Music therapy is perhaps best known for its utility in autism (Møller et al., 2002; Reschke-Hernández, 2011; Srinivasan and Bhat, 2013), where it has been used to improve emotional and social capacities (Kim et al., 2009; LaGasse, 2014). Given the aforementioned rSTG abnormalities in autism as well as our rSTG results, further research should explore the neural correlates and possible neuroplastic effects of music interventions for social and emotional development in autism. This may explain the consistent inclusion of drumming in autism music therapy and the special attention paid to rhythmic and motor aspects of music in autism (Wan et al., 2011; Srinivasan and Bhat, 2013). However, while drumming has a number of musical elements and is often part of group music-making, drumming and music are not identical. While music is well established to cue both arousal and valence, we demonstrated the capacity for drumming to communicate arousal better than valence. This suggests that drumming interventions may be more effective for psychopathology typically associated with arousal (e.g. anxiety disorders, such as PTSD) than for psychopathology typically associated with valence (e.g. mood disorders, such as depression). Recent work using drumming in clinical populations substantiates this hypothesis. Bensimon et al. (2008) found drumming to be an effective intervention for PTSD patients, reducing symptoms, facilitating ‘non-intimidating access to traumatic memories’ and allowing a regained sense of self-control and a release of anger.
In another study, the effectiveness of drumming for substance use disorder was heavily linked to its ability to induce relaxation and ‘release’ emotional trauma (Winkelman, 2003). Interestingly, both of these studies highlighted the effect of drumming on increased sense of belonging, intimacy and connectedness, perhaps a reflection of our own cross-brain coherence findings. Further investigation of drumming in high arousal and high anxiety disorders within a neuroscientific framework could improve specificity and efficacy of treatment of these disorders, particularly within social contexts.

Limitations

The limitations of fNIRS investigations are balanced by advantages that enable dual-brain imaging in live, natural, face-to-face conditions. This study is, to our knowledge, the first to investigate the neural correlates of nonverbal auditory communication of emotion in an ecologically valid setting. One unavoidable limitation is the restriction of fNIRS data acquisition to cortical activity, due to the limited penetration of infrared light through the skull. This excludes important limbic and striatal structures, which are known to be active in musical induction and perception of emotion (Blood and Zatorre, 2001; Brown et al., 2004; Koelsch, 2010; Peretz et al., 2013). In terms of our behavioral data, while we noted a correlation between drum response and both arousal and valence, the negative correlation observed between drum response and valence may actually be due to arousal. The arousal and valence distributions of our IAPS subset (Figure S1, Supplementary Materials) indicate a relative lack of images with low valence and low arousal. This bias, which reflects a similar bias in the complete IAPS image set, results in an overrepresentation of low-valence images with high arousal, potentially mediating the observed negative correlation in which drum response increases with lower valence. In the comparison of drumming and talking conditions, we recognize that the region of greater TPJ activation elicited in drumming likely contains some contributory activation from the nearby SMC, as expected for a drumming task. That said, the higher probability of TPJ regions indicated by our digitizing process, as well as the breadth and significance of the observed neural activity in this area, provides confidence that there is a strong component of TPJ activation in drumming over talking. This speculative result invites further investigation into the utility of drumming over talking as a communication modality with clinical application that elicits social–emotional engagement.
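The arousal-mediation concern above can be made concrete with a small sketch on synthetic data (the values and coefficients here are hypothetical, not the study's measurements): if arousal drives drum response and the stimulus set couples low valence to high arousal, a drum–valence correlation appears at the zero-order level but shrinks toward zero in a partial correlation controlling for arousal.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical stimulus set: arousal drives drum response, and valence is
# negatively coupled to arousal (mimicking the IAPS-subset bias described
# above). All coefficients are illustrative assumptions.
n = 200
arousal = rng.normal(size=n)
valence = -0.6 * arousal + rng.normal(scale=0.8, size=n)  # stimulus-set bias
drum = 0.7 * arousal + rng.normal(scale=0.7, size=n)      # arousal-driven response

# Zero-order correlation: drum response appears tied to valence.
r_zero, _ = stats.pearsonr(drum, valence)

def residualize(y, x):
    """Return y with the linear contribution of x regressed out."""
    slope, intercept, *_ = stats.linregress(x, y)
    return y - (slope * x + intercept)

# Partial correlation of drum and valence, controlling for arousal:
# correlate the residuals after removing arousal from both variables.
r_partial, _ = stats.pearsonr(residualize(drum, arousal),
                              residualize(valence, arousal))
print(f"zero-order r = {r_zero:.2f}, partial r = {r_partial:.2f}")
```

A check of this kind, applied to the actual drum-response, arousal and valence measures, would distinguish a direct valence effect from one mediated by arousal.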
Finally, our subject population of mostly college students may limit generalizability. In particular, while drum experience was very low, there was a moderate level of average musical expertise that may have facilitated subjects’ drum communication of emotion (Methods, Table 1). Further research should replicate these results in both drum-naïve and music-naïve populations.

Our study demonstrated the particular contribution of drumming to emotional communication, associated with activity in the right TPJ. The observed sensitivity during listening of the STG and SMG within the right TPJ, a canonical social and emotion processing center, holds implications for social–emotional psychopathology. These findings inform future research on nonverbal auditory communication in clinical contexts ranging from autism to PTSD.
References (73 in total)

1.  Emotional responses to pleasant and unpleasant music correlate with activity in paralimbic brain regions.

Authors:  A J Blood; R J Zatorre; P Bermudez; A C Evans
Journal:  Nat Neurosci       Date:  1999-04       Impact factor: 24.884

2.  Functional neuroimaging and childhood autism (Review).

Authors:  Nathalie Boddaert; Monica Zilbovicius
Journal:  Pediatr Radiol       Date:  2001-11-13

3.  A quantitative comparison of NIRS and fMRI across multiple cognitive tasks.

Authors:  Xu Cui; Signe Bray; Daniel M Bryant; Gary H Glover; Allan L Reiss
Journal:  Neuroimage       Date:  2010-11-01       Impact factor: 6.556

4.  Leader emergence through interpersonal neural synchronization.

Authors:  Jing Jiang; Chuansheng Chen; Bohan Dai; Guang Shi; Guosheng Ding; Li Liu; Chunming Lu
Journal:  Proc Natl Acad Sci U S A       Date:  2015-03-23       Impact factor: 11.205

5.  Music therapy for in-patients with schizophrenia: exploratory randomised controlled trial.

Authors:  Nakul Talwar; Mike J Crawford; Anna Maratos; Ula Nur; Orii McDermott; Simon Procter
Journal:  Br J Psychiatry       Date:  2006-11       Impact factor: 9.319

6.  Music therapy for depression (Review).

Authors:  A S Maratos; C Gold; X Wang; M J Crawford
Journal:  Cochrane Database Syst Rev       Date:  2008-01-23

7.  Auditory-motor mapping training as an intervention to facilitate speech output in non-verbal children with autism: a proof of concept study.

Authors:  Catherine Y Wan; Loes Bazen; Rebecca Baars; Amanda Libenson; Lauryn Zipse; Jennifer Zuk; Andrea Norton; Gottfried Schlaug
Journal:  PLoS One       Date:  2011-09-29       Impact factor: 3.240

8.  Signal processing of functional NIRS data acquired during overt speaking.

Authors:  Xian Zhang; Jack Adam Noah; Swethasri Dravida; Joy Hirsch
Journal:  Neurophotonics       Date:  2017-09-11       Impact factor: 3.593

9.  Sensor space group analysis for fNIRS data.

Authors:  S Tak; M Uga; G Flandin; I Dan; W D Penny
Journal:  J Neurosci Methods       Date:  2016-03-04       Impact factor: 2.390

10.  Distributed Neural Activity Patterns during Human-to-Human Competition.

Authors:  Matthew Piva; Xian Zhang; J Adam Noah; Steve W C Chang; Joy Hirsch
Journal:  Front Hum Neurosci       Date:  2017-11-23       Impact factor: 3.169
