Literature DB >> 35277722

Perceptive and affective impairments in emotive eye-region processing in alexithymia.

Zhihao Wang1,2, Katharina S Goerlich2, Pengfei Xu3,4, Yue-Jia Luo1,5,6, André Aleman1,2.   

Abstract

Alexithymia is characterized by impairments in emotion processing, frequently linked to facial expressions of emotion. The eye-region conveys information necessary for emotion processing. It has been demonstrated that alexithymia is associated with reduced attention to the eyes, but little is known regarding the cognitive and electrophysiological mechanisms underlying emotive eye-region processing in alexithymia. Here, we recorded behavioral and electrophysiological responses of individuals with alexithymia (ALEX; n = 25) and individuals without alexithymia (NonALEX; n = 23) while they viewed intact and eyeless faces with angry and sad expressions during a dual-target rapid serial visual presentation task. Results showed different eye-region focuses and differentiating N1 responses between intact and eyeless faces to anger and sadness in NonALEX, but not in ALEX, suggesting deficient perceptual processing of the eye-region in alexithymia. Reduced eye-region focus and smaller differences in frontal alpha asymmetry in response to sadness between intact and eyeless faces were observed in ALEX compared with NonALEX, indicative of impaired affective processing of the eye-region in alexithymia. These findings highlight perceptual and affective abnormalities of emotive eye-region processing in alexithymia. Our results contribute to understanding the neuropsychopathology of alexithymia and alexithymia-related disorders.
© The Author(s) 2022. Published by Oxford University Press.

Keywords:  N1; alexithymia; eye-region; frontal alpha asymmetry (FAA); three-stage model of facial expression processing

Year:  2022        PMID: 35277722      PMCID: PMC9527467          DOI: 10.1093/scan/nsac013

Source DB:  PubMed          Journal:  Soc Cogn Affect Neurosci        ISSN: 1749-5016            Impact factor:   4.235


Introduction

Alexithymia is a subclinical personality trait characterized by an impaired ability to identify, describe and regulate one’s feelings (Sifneos, 1973; Taylor, 1984; Luminet ). Affecting about 10% of the general population (Honkalampi ), alexithymia is thought to be a transdiagnostic risk factor for various mental disorders. This holds for depression and anxiety (Hendryx ; Li ), autism spectrum disorder (ASD; Bird ), substance abuse disorders (Cruise and Becerra, 2018), posttraumatic stress disorder (Frewen ), somatic symptom disorders (Cerutti ), eating disorders (Marsero ) and psychotic disorders (Van der Velde ), amongst others. There is substantial evidence that difficulties in emotion processing are at the core of alexithymia (Lane ; Swart ; Ihme ; for a recent review, see Luminet ), especially regarding automatic processing of negative stimuli (for details, see Donges and Suslow, 2017). As frequently perceived emotional and social stimuli, facial expressions are widely used to examine emotion processing in alexithymia (for a meta-analysis, see Van der Velde ). The eye-region conveys important information necessary for deciphering emotions (Itier and Batty, 2009). For example, difficulties in emotion recognition in a patient with amygdala damage were shown to result from reduced fixation on the eye-region (Adolphs ). From an evolutionary viewpoint, it has been argued that human eyes have a unique morphology (e.g. the largest area of exposed sclera in the eye outline among primate species) that supports socially cooperative behaviors such as joint hunting (Kobayashi and Kohshima, 1997). Of all facial features, the eyes are the most important for human face perception (Itier ). Crucially, different facial expressions of emotion are linked to different physical features in the eye-region (Ekman and Friesen, 1971). Among the six basic emotions, the eye-region receives more attention than other facial features, especially when recognizing anger and sadness (Eisenbarth and Alpers, 2011).
Sadness is characterized by a down-looking gaze, and anger by a frowning of the eyebrows (Ekman and Friesen, 1971; Itier and Batty, 2009). The three-stage model of facial expression processing proposes (i) automatic processing of negative-valence facial expressions, as indexed by the N1 and P1, (ii) valence processing distinguishing emotional from neutral facial expressions, as indexed by the N170 and vertex positive potential (VPP), and (iii) category processing differentiating facial expressions of emotion, as indexed by the N3 and P3 (Luo ). Perceptive processing of the eye-region may thus be captured by early event-related potentials (ERPs), including the N1 and P1 (for a review, see Calvo ). Although the N1 and P1 reflect early sensory processing, discrimination effects of these early components have been shown for facial expressions, demonstrating emotional modulation of the sensory N1 and P1 (Eimer and Holmes, 2002; Dennis ; Luo ; Ellena ), which often arises when people are not required to attend to the emotional meaning of stimuli (see Vuilleumier, 2005 for a comprehensive review). In addition to early perceptive processing of angry and sad eye-regions, affective encoding, reflecting approach or withdrawal motivation, is further represented in the human brain. Angry and sad facial expressions are motivationally distinct emotions with differences in hemispheric dominance, which can be indexed by frontal alpha asymmetry (FAA; right relative to left hemisphere) during rest (Harmon-Jones ; Allen ; Reznik and Allen, 2018). Sad faces, eliciting withdrawal tendencies, are associated with increased FAA, whereas angry expressions, eliciting approach tendencies, are associated with decreased FAA (Harmon-Jones ).
In addition to ample evidence reporting such associations during the resting state, FAA was recently elicited by erotic compared to neutral stimuli in an event-related task design (Schöne ), validating task-related FAA as a marker of approach vs withdrawal processing. Moreover, ample evidence shows that the N170 is sensitive to facial expressions (for a meta-analysis, see Hinojosa ). Atypical eye-region processing has long been considered one of the most significant symptoms in individuals with ASD (for a review, see Senju and Johnson, 2009), but recent evidence attributes deficient eye-region processing in individuals with ASD to alexithymia (Bird ), which is highly comorbid with autism (Cook ). That is, alexithymia, rather than autism, accounts for social-emotional processing difficulties (e.g. deficient eye-region processing) in patients with autism (Bird ; Cook ). Therefore, investigation of eye-region processing in alexithymia is of great importance. So far, there have been two eye-region processing studies (using eye-tracking) in alexithymia. Bird examined gaze patterns in relation to alexithymia during passive viewing of video clips. They observed that alexithymia was negatively correlated with the degree of attention to the eyes, as indexed by the eye/mouth ratio of gaze fixations within the face, suggesting reduced eye-fixation in alexithymia. Fujiwara (2018) tested the impact of this reduced eye preference on facial emotion recognition and replicated the reduced eye preference in alexithymia. Importantly, although individuals with and without alexithymia (ALEX and NonALEX) showed equal accuracy in emotion recognition, increased fixation on the eye-region in ALEX did not improve their performance but in fact increased their errors (in contrast to NonALEX). These findings suggest that looking at the eyes may even be confusing and thus interfere with facial emotion recognition in people with alexithymia.
While progress has been made toward scan-path analysis of the eyes in association with alexithymia (Bird ; Fujiwara, 2018), the cognitive and electrophysiological mechanisms of emotive eye-region processing in alexithymia remain unknown. A meta-analysis of emotion processing studies identified key neural correlates of alexithymia during the presentation of negative emotional stimuli in the amygdala and in supplementary motor and premotor areas (Van der Velde ). Decreased activation in the amygdala possibly reflects impaired affective encoding (Adolphs ; Van der Velde ), while the diminished response in supplementary motor and premotor areas may point toward a deficiency in perceptual processing (Adolphs ; Calvo and Nummenmaa, 2016). Furthermore, FAA was negatively correlated with alexithymia scores in a resting-state EEG study (Flasbeck ). Therefore, it is reasonable to assume that perceptive and affective processing of the emotive eye-region may be compromised in alexithymia. Whereas it is difficult for eye-tracking studies to dissociate perceptual from affective processing, electroencephalography (EEG), with its high temporal resolution, is a sensitive method for recording brain responses during different cognitive processes (Luck, 2012). Here, early ERPs (i.e. N1 and P1) and the N170 were used to assess perceptual mechanisms, and FAA to assess affective mechanisms, during emotive eye-region processing in alexithymia (Donges and Suslow, 2017; Luminet ). Intact and eyeless faces with angry and sad expressions were used to probe eye-region processing (Itier ) in a group of individuals without clinically relevant alexithymia levels as assessed with the 20-item Toronto Alexithymia Scale (TAS-20 score ≤51; NonALEX; n = 23) and a group of individuals exceeding the clinically relevant cut-off for alexithymia (TAS-20 score ≥61; ALEX; n = 25).
Here, we used a dual-target rapid serial visual presentation (RSVP) task (Raymond ), given that emotion processing has been shown to be sensitive to limited attentional resources (Luo ; Zhang ). We hypothesized reduced perceptive and affective encoding of eye-region information in alexithymia, which may be more pronounced for sadness given the stronger relevance of the eyes for this expression. We predicted that ALEX individuals would rely less on eye-information processing, translating into smaller differences in behavioral performance between intact and eyeless faces compared to NonALEX individuals. Regarding EEG parameters, we predicted reduced early ERP (i.e. N1 and P1) differences between intact and eyeless faces in ALEX compared to NonALEX individuals. We also hypothesized reduced FAA and N170 differences between intact and eyeless faces with sad expressions in ALEX compared to NonALEX, and increased FAA differences between intact and eyeless faces with angry expressions in ALEX compared to NonALEX.

Methods and materials

Participants

Forty-eight healthy adults participated in the experiment, drawn from a pool of 543 students (368 females; age: 17–38 years, mean ± s.d.: 20.03 ± 2.11 years) at Shenzhen University. Each participant in the pool completed the Chinese version of the TAS-20 (Bagby ; Zhu ). In light of the international cut-off for clinically relevant alexithymia on the TAS-20 (Taylor ), individuals with TAS-20 scores of 61 or higher (14.9% of the pool) were identified as individuals with alexithymia (ALEX), while those with scores of 51 or lower (54.6% of the pool) were classified as individuals without alexithymia (NonALEX). The final sample, randomly selected from these two groups, consisted of 25 participants in the ALEX group and 23 in the NonALEX group (see Table 1 for demographic information and personality characteristics). All participants had normal or corrected-to-normal vision and reported no current or past mental illness. All participants were compensated for their participation. This study was approved by the Ethics Committee of Shenzhen University, and written informed consent was obtained from all participants.
Table 1.

Demographics and questionnaire scores

            ALEX (n = 25; 12 females)        NonALEX (n = 23; 12 females)
            Mean (s.d.)      [min, max]      Mean (s.d.)     [min, max]     t        P        Cronbach's alpha
Age         19.96 (1.46)     [18, 24]        19.78 (1.81)    [17, 24]       0.376    0.709    –
TAS-20      66.08 (2.96)     [62, 73]        36.09 (5.01)    [22, 43]       24.992   <0.001   0.856
BDI         12.92 (6.36)     [3, 27]         4.00 (4.68)     [0, 17]        5.492    <0.001   0.870
BAI         29.28 (5.38)     [24, 48]        24.04 (3.39)    [21, 38]       3.992    <0.001   0.857
AQ          124.24 (10.26)   [100, 142]      109.61 (8.74)   [89, 120]      5.293    <0.001   0.673

Self-report questionnaires

The TAS-20 is widely used to measure alexithymia (Bagby ). Based on self-report, each of the 20 items is rated on a 5-point Likert scale ranging from 1 (‘strongly disagree’) to 5 (‘strongly agree’), with five items being reverse-keyed. For analysis, the reverse-keyed items are recoded, and the total score is calculated as the sum of all items. Higher scores represent higher levels of alexithymia. The Chinese version of the TAS-20 has been established with acceptable reliability and validity (Zhu ). To control for potential confounding effects of depression, anxiety and autism, participants also completed the Beck Anxiety Inventory (BAI; Beck ), the Beck Depression Inventory (BDI; Beck, 1967) and the Autism Spectrum Quotient (AQ; Baron-Cohen ).

Statistical power

The sample size of 48 participants (Table 1) was determined using G*Power (version 3.1; Faul ) based on a medium effect size. Twenty-three participants per group were needed to detect a reliable effect [Cohen’s f = 0.25, α = 0.05, 1 − β = 0.9, repeated-measures analysis of variance (ANOVA), within-between interaction; Faul ].

Picture stimuli

The design of the stimuli was inspired by Itier , who used blurring of certain areas of the face (including the eye-region) to test reliance on regional processing within the face. The intact and eyeless faces used as stimuli in the current study were consistent with previous studies testing reliance on eye-region processing within the face (Itier ; Nemrodov and Itier, 2011). Materials consisted of 3 upright house stimuli, 12 scrambled faces (SFs), 28 intact faces (14 angry and 14 sad) from the Taiwanese Facial Expression Image Database (Chen and Yen, 2007) and the corresponding 28 eyeless faces (14 angry and 14 sad). The intact facial expressions were captured from 14 models (7 females) with matched emotional properties between anger and sadness. We controlled for intensity and recognition rate between angry and sad intact faces when selecting stimuli from the database (Chen and Yen, 2007). Accordingly, paired-sample t-tests showed no significant differences in recognition rate or intensity between angry and sad faces [recognition rate: t(13) = 1.079, P = 0.300, angry: mean ± s.d. = 83.59 ± 9.02, sad: 80.01 ± 8.05; intensity: t(13) = −1.382, P = 0.190, angry: 3.77 ± 0.55, sad: 4.07 ± 0.64], suggesting that angry and sad faces were matched on these emotional dimensions. For the intensity rating, participants were asked to rate the intensity of the emotion expressed on a 9-point scale anchored at 0 (‘not at all’), 4 (‘moderate’) and 8 (‘high’). We used Adobe Photoshop CS5 to create eyeless faces and scrambled faces from the intact faces. For eyeless faces, we erased the eye-region of intact faces and filled it with skin-like texture (Figure 1B; Itier ). We also randomly scrambled intact faces to produce SFs; 12 SFs were selected at random, half derived from angry expressions. Note that SFs had the same rectangular shape, size, luminance and spatial frequency as the emotional pictures (adjusted in Adobe Photoshop CS5).
The viewing angle was 6 × 3.38°. All materials were gray-scaled and displayed in the center of the screen.
Fig. 1.

Experimental design of the dual-target RSVP paradigm (A) and examples of stimuli used in the current study (B).


Task and procedure

We adopted the dual-target RSVP paradigm (Figure 1A), with a core frame of two targets (T1 and T2), two questions (Q1 and Q2) and a stimulus-onset asynchrony of 300 ms between the two targets. This paradigm has been widely used and is sensitive to the time course of emotion processing under limited attentional resources (due to the attentional blink phenomenon elicited by T1; Luo ; Zhang ). At the beginning of each trial, following a white fixation cross of 500 ms, a blue fixation cross appeared in the center of the screen. Then, 12 SF pictures, 1 house stimulus and 1 emotional picture were displayed, each lasting 100 ms. In line with previous studies using dual-target RSVP to examine emotion processing, T1 showed one of three upright houses with equal probability, appearing randomly and equiprobably at the fourth, fifth, sixth or seventh position of the picture series (Luo ; Zhang ). T2 displayed, pseudo-randomly, one of five types of pictures (intact angry faces, intact sad faces, eyeless angry faces, eyeless sad faces and blank) 300 ms after the onset of T1. Note that the blank served as the baseline for the other four conditions to eliminate superposed electrical activity elicited by T1 and thus to obtain a pure emotional effect elicited by T2 (Sergent ; Luo ; Zhang ). After each picture series, participants were asked to answer Q1 and Q2 as accurately as possible. Q1 and Q2 were presented in a fixed order without a reaction-time limit. Q1 was ‘Which house was presented as T1?’ (press key ‘1’, ‘2’ or ‘3’ to match the house presented before). Q2 asked participants to judge the gender of T2 or whether a blank occurred in T2 (press key ‘1’ if the T2 face was male, key ‘3’ if it was female and key ‘2’ if a blank occurred). Male and female stimuli were equal in number for each category.
The questions disappeared once answered. Note that the gender discrimination task with angry and sad expressions was used to detect implicit emotion effects, given that people are more likely to attribute anger to male and sadness to female individuals (Schirmer, 2013). Performance on the gender discrimination task therefore reflects implicit emotion processing for each expression. At the end of each trial, a black screen appeared for 100 ms. The experiment included 350 trials (5 blocks of 70 trials), with 70 trials per condition (each emotional picture was repeated five times). Participants completed several practice rounds before the formal experiment started. All experimental procedures were presented using E-Prime 2.0 (Psychology Software Tools Inc., Pittsburgh, PA, USA).
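The stream structure described above (14 pictures of 100 ms each; T1 at position 4–7; T2 at a 300 ms SOA, i.e. three positions after T1) can be sketched as follows. This is a minimal illustration, not the study's E-Prime script, and all stimulus names are hypothetical placeholders.

```python
import random

def make_trial(sfs, houses, t2_pool, rng):
    """Assemble one RSVP stream; returns (stream, t1_pos, t2_pos), 1-based positions."""
    t1_pos = rng.choice([4, 5, 6, 7])         # T1 position, equiprobable
    t2_pos = t1_pos + 3                       # 300 ms SOA at 100 ms per item
    stream = [rng.choice(sfs) for _ in range(14)]
    stream[t1_pos - 1] = rng.choice(houses)   # T1: one of three houses
    stream[t2_pos - 1] = rng.choice(t2_pool)  # T2: face condition or blank
    return stream, t1_pos, t2_pos

rng = random.Random(0)
stream, t1, t2 = make_trial(
    sfs=[f"sf{i}" for i in range(12)],
    houses=["house1", "house2", "house3"],
    t2_pool=["intact_angry", "intact_sad", "eyeless_angry", "eyeless_sad", "blank"],
    rng=rng,
)
```

In the actual experiment, T2 categories would additionally be counterbalanced across the 350 trials rather than drawn independently per trial.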

EEG recording and preprocessing

We recorded EEG data from a 64-electrode scalp cap positioned according to the international 10-20 system (Brain Products, Munich, Germany), referenced online to channel FCz. The vertical electrooculogram (EOG) was recorded with an electrode placed below the right eye. Electrode impedances for EEG and EOG were kept below 5 kΩ. All channels were amplified with a 0.01 Hz online high-pass filter and continuously sampled at 1000 Hz for offline analysis. EEG data were preprocessed with EEGLAB 14.1.2b (Delorme and Makeig, 2004) in Matlab 2014b (MathWorks Inc.). Preprocessing comprised the following steps: (i) resampling to 250 Hz; (ii) low-pass filtering at 30 Hz with an FIR filter (7.5 Hz transition bandwidth); (iii) epoching from 500 ms before to 1000 ms after T2 onset; (iv) baseline correction (−200 to 0 ms); (v) manually rejecting epochs with salient muscle artifacts and bad channels (if any); (vi) independent component analysis (ICA); (vii) visually inspecting and rejecting artifact components (horizontal and vertical eye movements and muscle components); (viii) interpolating bad channels (if any); (ix) re-referencing offline to the average of all electrodes and (x) rejecting trials in which EEG voltages fell outside the range [−80, 80] μV. After rejection, each condition retained at least 65 trials per participant.
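Steps (iv) and (x) above, baseline correction and voltage-threshold trial rejection, can be sketched in a few lines. This is an illustrative numpy version under the stated parameters (epochs of −500 to 1000 ms at 250 Hz, ±80 μV limit), not the EEGLAB code; the toy data are synthetic.

```python
import numpy as np

FS = 250      # sampling rate after resampling (Hz)
T_MIN = -0.5  # epoch start relative to T2 onset (s)

def baseline_correct(epochs, t_min=T_MIN, fs=FS, base=(-0.2, 0.0)):
    """Subtract the mean of the -200..0 ms interval per trial and channel.

    epochs: array of shape (n_trials, n_channels, n_samples), in microvolts.
    """
    i0 = int((base[0] - t_min) * fs)
    i1 = int((base[1] - t_min) * fs)
    return epochs - epochs[:, :, i0:i1].mean(axis=2, keepdims=True)

def reject_by_voltage(epochs, limit_uv=80.0):
    """Keep only trials whose voltages stay within [-limit, +limit] microvolts."""
    keep = np.all(np.abs(epochs) <= limit_uv, axis=(1, 2))
    return epochs[keep], keep

# Toy example: 3 trials, 2 channels, 375 samples (-500..1000 ms at 250 Hz)
rng = np.random.default_rng(0)
epochs = rng.normal(0, 10, size=(3, 2, 375))
epochs[1, 0, 200] = 150.0  # plant an artifact in trial 1
clean, keep = reject_by_voltage(baseline_correct(epochs))
```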

Behavioral statistics

We used SPSS 17.0 to perform statistical analyses, with the significance level set at P = 0.05. In line with previous studies using the dual-target RSVP paradigm to effectively elicit the attentional blink (Luo ; Zhang ), accuracy was defined as both T1 and T2 being answered correctly. In line with previous studies assessing the role of the eye-region (Bird ; Fujiwara, 2018), eye-region focus was defined as the difference in accuracy between intact and eyeless faces (intact − eyeless). This subtraction maps directly onto our hypotheses, as the aim of the current study was specifically to examine the role of the eye-region in emotion processing in alexithymia. We also conducted full ANOVA models with the intact and eyeless conditions as a separate factor to test which condition drives the eye-region processing deficits (see supplementary materials). Given the response bias to attribute anger to male and sadness to female individuals (Schirmer, 2013), the gender of the presented faces was added as a factor in the behavioral statistics. Eye-region focus was subjected to a three-way ANOVA with Group (ALEX/NonALEX) as a between-subject factor and Emotion (anger/sadness) and Gender (female/male) as within-subject factors.
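The eye-region focus index amounts to a per-subject subtraction before the ANOVA. A minimal sketch, with hypothetical accuracies rather than the study's data:

```python
import numpy as np

# Hypothetical per-subject accuracies (both T1 and T2 correct) for one
# emotion condition; one entry per subject. Values are illustrative only.
acc_intact  = np.array([0.84, 0.88, 0.85])
acc_eyeless = np.array([0.56, 0.58, 0.54])

# Eye-region focus = intact - eyeless (Bird ; Fujiwara, 2018)
eye_region_focus = acc_intact - acc_eyeless
```

These per-subject difference scores (one per Emotion × Gender cell) are then the dependent variable entered into the Group × Emotion × Gender ANOVA.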

ERP statistics

Following segmentation from 200 ms before to 800 ms after T2 onset, all clean trials were included in the statistical analyses (not only correct trials, because of the implicit nature of the emotion processing task employed in this study). We included trials with both correct and incorrect responses to T1 in line with the resource-sharing account of the attentional blink (Shapiro , 2006), which proposes that T1 and T2 compete for limited attentional resources in dual-target situations. Given that gender judgment was more difficult for intact than for eyeless faces in T2, it was preferable to include all clean trials in the statistical analysis. ERP analysis focused on the frontal-central N1 (for details, see Supplement). Visual inspection of the grand-averaged waveform and its topography confirmed the N1 time window (Figure 3A). The N1 was identified in a window of 110–160 ms over frontal-central electrodes (FCz, Cz, FC1, FC2, C1, C2, FC3, FC4, C3, C4; Vogel and Luck, 2000; Luo ). T2-locked average waveforms for the four conditions (Intact anger, Intact sadness, Eyeless anger and Eyeless sadness) were computed separately for each participant as differences between each condition and the blank condition, to eliminate superposed electrical activity elicited by T1 and thus obtain a pure emotional effect elicited by T2 (Sergent ; Luo ; Zhang ). N1 differences between intact and eyeless faces were analyzed using a 2 (ALEX/NonALEX) × 2 (anger/sadness) repeated-measures ANOVA. As suggested by Calvo and Nummenmaa (2016), both the N1 and the N170 reflect perceptive processing. Given that participants were looking at faces, we also analyzed the posterior/occipital N170 (P7, P8, PO7, PO8; 225–275 ms; Calvo and Nummenmaa, 2016). N170 differences between intact and eyeless faces were likewise analyzed using a 2 (ALEX/NonALEX) × 2 (anger/sadness) repeated-measures ANOVA.
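The N1 measure described above (mean amplitude of the condition-minus-blank difference wave in the 110–160 ms window, averaged over the frontal-central cluster) can be sketched as follows. This is an illustrative numpy version with flat toy waveforms, not the analysis code; a real ERP would of course vary over time.

```python
import numpy as np

FS, T_MIN = 250, -0.2       # segmentation: -200..800 ms around T2 onset
N1_WINDOW = (0.110, 0.160)  # N1 window: 110-160 ms
FC_CLUSTER = ["FCz", "Cz", "FC1", "FC2", "C1", "C2", "FC3", "FC4", "C3", "C4"]

def n1_amplitude(cond_erp, blank_erp, channels):
    """Mean N1-window amplitude of the (condition - blank) difference wave.

    cond_erp, blank_erp: dict mapping channel name -> 1-D array of samples.
    """
    i0 = int((N1_WINDOW[0] - T_MIN) * FS)
    i1 = int((N1_WINDOW[1] - T_MIN) * FS)
    diffs = [cond_erp[ch][i0:i1] - blank_erp[ch][i0:i1] for ch in channels]
    return float(np.mean(diffs))

n = int((0.8 - T_MIN) * FS)                          # 250 samples per epoch
cond  = {ch: np.full(n, -1.0) for ch in FC_CLUSTER}  # toy: flat -1.0 uV
blank = {ch: np.full(n, -0.4) for ch in FC_CLUSTER}  # toy: flat -0.4 uV
amp = n1_amplitude(cond, blank, FC_CLUSTER)          # -> -0.6
```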
Fig. 3.

Electrophysiological results. (A) Time course at Cz electrode for each condition at the onset of T2 and topographic maps in N1 (100 ∼ 160 ms) of emotional differences in NonALEX relative to ALEX. Electrodes marked with enlarged white dots were used to evaluate amplitudes of N1. (B) The N1 result. (C) The FAA results. Note the horizontal white lines represent the mean value of each group. *P < 0.05, ∼P < 0.1.

Frontal alpha asymmetry statistics

Time–frequency distributions of each clean trial were computed with a short-time Fourier transform, using a Hanning window of 250 ms and detrending. We computed power at each point in the time domain (−500 to 1000 ms; steps of 4 ms) and frequency domain (1 to 30 Hz; steps of 1 Hz). T2-locked average time–frequency power was normalized as [(task − baseline)/baseline], where task refers to each time–frequency point after T2 onset and baseline refers to the mean power from −500 to −300 ms before T2 onset. Recent evidence shows that alpha signals can be reliably detected with these parameters (Fang ; Wang ). Again, power in each condition of interest was computed separately for each participant as the difference between that condition and the blank condition. FAA scores were defined as the difference in alpha (8–12 Hz) power between the right and left hemispheres [right − left: (F4 + F6) − (F3 + F5)]. In the time domain, the 1000 ms period of T2 processing was divided into 10 consecutive 100 ms windows. For each window, we conducted a two-way repeated-measures ANOVA on FAA differences between intact and eyeless faces, with Emotion (anger/sadness) as a within-subject variable and Group (ALEX/NonALEX) as a between-subject variable. Multiple comparisons were corrected by false discovery rate (FDR) at a significance level of P = 0.05. Note that the gender of the presented faces was not added as a variable in the N1 and FAA statistics because no significant gender-related group effects were found at the behavioral level (see the ‘Results’ section).
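Once baseline-normalized power is available per channel, the FAA score reduces to an alpha-band average and a right-minus-left subtraction. A minimal sketch with illustrative power values (not the study's data):

```python
import numpy as np

ALPHA = (8, 12)  # alpha band in Hz

def faa(power, freqs):
    """Frontal alpha asymmetry: (F4 + F6) - (F3 + F5) alpha power.

    power: dict channel -> 1-D array of baseline-normalized power per
           frequency bin [(task - baseline) / baseline], already averaged
           over one 100 ms time window.
    freqs: 1-D array of frequency bins (Hz).
    """
    band = (freqs >= ALPHA[0]) & (freqs <= ALPHA[1])
    alpha = {ch: power[ch][band].mean() for ch in power}
    return (alpha["F4"] + alpha["F6"]) - (alpha["F3"] + alpha["F5"])

freqs = np.arange(1, 31)  # 1-30 Hz in 1 Hz steps
toy = {                   # illustrative normalized power, flat across bins
    "F3": np.full(30, 0.10), "F5": np.full(30, 0.20),
    "F4": np.full(30, 0.40), "F6": np.full(30, 0.30),
}
score = faa(toy, freqs)   # (0.40 + 0.30) - (0.10 + 0.20) = 0.40
```

A positive score indicates greater right- than left-frontal alpha power, the direction associated here with withdrawal-related processing.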

Results

Behavioral results

One-sample t-tests showed that accuracy in each condition (Intact anger, Intact sadness, Eyeless anger and Eyeless sadness) was significantly higher than 0.5, as well as higher than in the blank condition [ts(47) > 4.854, Ps < 0.001, Cohen’s d > 0.701], suggesting sufficient ability and engagement in the task. See Table 2 for descriptive data. For eye-region focus, the three-way ANOVA revealed a significant main effect of Gender [F(1,46) = 11.578, P = 0.001, ηp2 = 0.120, male > female, 95% CI: 0.1074, 0.287]. Importantly, we found a significant interaction between Gender and Emotion [F(1,46) = 29.548, P < 0.001, ηp2 = 0.391; Figure 2A]. Simple effect analyses showed that angry expressions were easier to identify than sad expressions in male faces [F(1,46) = 15.848, P < 0.001, ηp2 = 0.256, 95% CI: 0.036, 0.111; anger: 0.421 ± 0.030, sadness: 0.347 ± 0.026], whereas sad expressions were easier to identify than angry expressions in female faces [F(1,46) = 21.310, P < 0.001, ηp2 = 0.317, 95% CI: 0.064, 0.162; anger: 0.147 ± 0.033, sadness: 0.260 ± 0.033], confirming the previously observed response bias and suggesting that performance on the current task reflects implicit emotion processing. Regarding alexithymia, we observed a significant interaction between Group and Emotion [F(1,46) = 6.287, P = 0.016, ηp2 = 0.120; Figure 2B]. Simple effect analyses revealed no significant difference between anger and sadness in ALEX [F(1,46) = 0.558, ηp2 = 0.012, 95% CI: −0.023, 0.051; anger: 0.283 ± 0.017, sadness: 0.269 ± 0.017], whereas NonALEX showed a significant difference [F(1,46) = 7.606, P = 0.008, ηp2 = 0.142, 95% CI: −0.091, −0.014; anger: 0.285 ± 0.018, sadness: 0.338 ± 0.018].
Simple effect analyses of the Group × Emotion interaction further indicated that the reduced eye-region focus in ALEX occurred only when perceiving sad expressions [F(1,46) = 7.521, P = 0.009, ηp2 = 0.140, 95% CI: −0.119, −0.018], not angry expressions [F(1,46) = 0.008, ηp2 < 0.001, 95% CI: −0.052, 0.047]. No other significant main effect was found (all Ps > 0.147).
Table 2.

Behavioral accuracy and electrophysiological responses in each experimental condition of each group

Eye       Emotion   Group     Accuracy        N1 (μV)          FAA
Intact    Anger     ALEX      0.839 (0.098)   −0.653 (0.615)   0.514 (1.196)
                    NonALEX   0.868 (0.060)   −0.727 (0.609)   0.267 (1.291)
          Sadness   ALEX      0.846 (0.086)   −0.620 (0.584)   0.695 (1.074)
                    NonALEX   0.880 (0.059)   −0.580 (0.622)   0.589 (1.538)
Eyeless   Anger     ALEX      0.557 (0.075)   −0.556 (0.574)   0.309 (1.128)
                    NonALEX   0.583 (0.064)   −0.399 (0.709)   0.725 (1.372)
          Sadness   ALEX      0.577 (0.079)   −0.448 (0.478)   0.528 (1.302)
                    NonALEX   0.542 (0.090)   −0.609 (0.605)   −0.327 (1.241)

Descriptive data are presented as mean (s.d.). Accuracy represents accuracy that both T1 and T2 are correct.

Fig. 2.

Behavioral results. (A) Interaction effect between Gender presented in the stimuli and Emotion. (B) Interaction effect between Emotion and Group. *P < 0.05.


N1 amplitude results

With respect to N1 amplitude differences between intact and eyeless faces, the two-way ANOVA revealed a significant interaction between Emotion and Group [F(1,46) = 5.336, P = 0.025, ηp2 = 0.104]. Simple effect analyses revealed no significant difference between anger and sadness in ALEX [F(1,46) = 0.335, ηp2 = 0.007, 95% CI: −0.185, 0.335], while in NonALEX, N1 difference amplitudes were significantly more negative for anger than for sadness [F(1,46) = 6.997, P = 0.011, ηp2 = 0.132, 95% CI: −0.627, −0.085]. No other significant effect was found (all Ps > 0.138). In addition, we did observe the N170 component (see Supplementary Figure S1). A 2 (ALEX/NonALEX) × 2 (anger/sadness) repeated-measures ANOVA on N170 differences between intact and eyeless faces revealed a significant main effect of Emotion [F(1,46) = 4.380, P = 0.042, ηp2 = 0.087, 95% CI: −0.626, −0.012], with the angry eye-region eliciting more negative N170 difference waves than the sad one. No other significant effects were found (Ps > 0.502). These ERP results suggest that the N1 might be more sensitive than the N170 in capturing perceptive deficits of the eye-region in ALEX individuals.

FAA results

For FAA differences between intact and eyeless conditions, the two-way ANOVA revealed a significant interaction between Emotion and Group during 100–200 ms [F(1,46) = 7.635, P = 0.008, ηp2 = 0.142]; this P-value was FDR-corrected (10 comparisons). During the 100–200 ms stage, the ANOVA also revealed a significant main effect of Emotion [F(1,46) = 6.846, P = 0.012, ηp2 = 0.130; 95% CI: −1.183, −0.154; anger < sadness], with sadness eliciting higher FAA than anger, consistent with approach–withdrawal theory at the level of eye-region processing (Harmon-Jones ). Simple effect analyses of the interaction showed that ALEX, compared to NonALEX, exhibited significantly stronger FAA in the anger condition [F(1,46) = 4.302, P = 0.044, ηp2 = 0.086, 95% CI: 0.020, 1.307] and marginally weaker FAA in the sadness condition [F(1,46) = 3.297, P = 0.076, ηp2 = 0.067, 95% CI: −1.578, 0.081]. Simple effect analyses in the other direction revealed no significant difference between anger and sadness in ALEX [F(1,46) = 0.011, P = 0.916, ηp2 < 0.001, 95% CI: −0.674, 0.749], while NonALEX exhibited stronger FAA for sadness than for anger [F(1,46) = 13.892, P = 0.001, ηp2 = 0.232, 95% CI: −2.117, −0.632], suggesting abnormal approach–withdrawal processing of emotive eye-regions in alexithymia. No other significant effect was found (all Ps > 0.874). We replicated these results using a window size of 300 ms in the time–frequency decomposition and confirmed that the alpha asymmetry was specific to the frontal area (for details, see supplementary materials).

Control analyses

To check whether the current results are robust after controlling for anxiety, depression and autism, we added the BDI, BAI and AQ as covariates. For eye-region focus, there was a marginally significant interaction between BDI and Emotion [F(1,43) = 3.589, P = 0.065, ηp2 = 0.077], which violated the assumption of analysis of covariance that covariates should not interact with the effect of interest (Miller and Chapman, 2001). We therefore added only the BAI and AQ as covariates. Results showed a marginally significant interaction between Group and Emotion [F(1,44) = 3.332, P = 0.075, ηp2 = 0.070]: there were no significant differences between anger and sadness in ALEX [F(1,44) = 0.523, ηp2 = 0.009, 95% CI: −0.031, 0.059; anger: 0.290 ± 0.020, sadness: 0.276 ± 0.021], whereas NonALEX showed a significant difference [F(1,44) = 5.178, P = 0.028, ηp2 = 0.105, 95% CI: −0.100, −0.006; anger: 0.277 ± 0.022, sadness: 0.330 ± 0.022]. Our N1 and FAA results were not influenced by levels of anxiety, depression and autism (for details, see Supplement).

Discussion

The current study contrasted individuals with and without alexithymia with regard to the cognitive and electrophysiological mechanisms underlying eye-region processing of emotional facial expressions. Consistent with previous studies (Bird ; Fujiwara, 2018), ALEX made less use of eye-region information in emotion processing. Alexithymic participants relied less on perceptual processing of the eye-region than non-alexithymic individuals, as could be inferred from the different eye-region focuses and differentiating N1 responses (between intact and eyeless faces) to anger and sadness in NonALEX, but not in ALEX. On the other hand, reduced eye-region focus and smaller FAA differences between intact and eyeless faces were observed in ALEX compared to NonALEX in response to sadness, indicative of diminished affective encoding of the sad eye-region. Taken together, these findings suggest perceptual and affective deficits in emotive eye-region processing in alexithymia.

Unlike previous eye-tracking studies examining the scan path of eye-region processing in alexithymia (Bird ; Fujiwara, 2018), we used EEG to examine the perceptual and affective mechanisms of emotive eye-region processing. The different eye-focus patterns between anger and sadness in NonALEX, but not in ALEX, were also reflected in group differences for the N1. The N1 component, originating from the sensory system, has been associated with visual processing of facial features, with larger N1 amplitudes reflecting stronger attentional capture (Calvo ). Importantly, N1 differences have been demonstrated for different facial expressions (Luo ). Given the different physical properties of the eye-region in angry and sad facial expressions (Ekman and Friesen, 1971; Itier and Batty, 2009), the current N1 results suggest difficulties in perceptual processing of the eye-region in people with alexithymia.
Although the N1 results did not show the expected group differences in eye-focus for either anger or sadness, the difficulty in differentiating emotive eye-regions between anger and sadness does support our hypothesis of deficient perceptive processing of the emotive eye-region in alexithymia. It has been documented that expression recognition is perceptually driven, without affective encoding (Calvo ). Therefore, the difficulty in recognizing emotional expressions typically observed in alexithymia may be driven by perceptual deficits in the early encoding of affective stimuli (see Donges and Suslow, 2017 for a review on automatic processing of emotional information in alexithymia). As suggested by Calvo and Nummenmaa (2016), both the N1 and the P1 reflect perceptive processing. Regarding the P1, there was a non-significant trend toward larger P1 amplitudes in ALEX than in NonALEX, indicative of greater engagement of attentional resources in processing emotive eye-regions in individuals with alexithymia (Delle-Vigne ). These results suggest that the N1 might be more sensitive in capturing perceptive deficits of the eye-region in individuals with alexithymia. The relevance of the N1 to facial emotion processing was also recently observed in an ERP study of spatial processing in peripersonal space (Ellena ). In line with approach–withdrawal theory (Harmon-Jones ), sadness (a withdrawal-related negative emotion) indeed elicited higher FAA than anger (an approach-related negative emotion) at the level of eye-region processing in individuals without alexithymia. However, alexithymic individuals did not show this difference, indicative of impaired approach–withdrawal affective encoding. Importantly, we observed a reduced eye-region focus for sad facial expressions in ALEX compared to NonALEX. The same pattern was found in the FAA, supporting our hypothesis of impaired affective encoding of the emotive eye-region in alexithymia. Decreased FAA to withdrawal-related expressions (e.g. sadness) has long been considered a correlate of decreased processing of avoidance emotions (Harmon-Jones ). The FAA is also a reliable and stable index over time, especially in patients with mood disorders (Allen ). Despite ample evidence of emotion processing using FAA from resting-state studies (for a review, see Harmon-Jones ), to the best of our knowledge, only one study has validated task-related FAA (Schöne ). In that event-related design, FAA was observed for approach as compared to neutral stimuli from 500 to 1000 ms after stimulus onset (identified by visual inspection), suggesting that increased FAA represents approach processing in tasks. Therefore, the current decrease in task-related FAA differences between intact and eyeless conditions in ALEX compared to NonALEX in response to sad faces may reflect reduced withdrawal processing of the eye-region in sadness. As the FAA reflects hemispheric interaction, this may be consistent with the hypoarousal model and the theory of a right-hemisphere deficit or a left-hemisphere preference in alexithymia (Buchanan ; Wehmer ; Bermond ). A causal role of the FAA in the emotional response has been suggested by a study using neurofeedback training (Allen ). This may have important implications for the treatment of alexithymia-related disorders. In addition, the emotion recognition ability of a patient with amygdala damage improved after the patient was explicitly instructed to look at the eyes. Our recent study showed social-specific impairments of negative emotion processing in alexithymia (Wang ). Therefore, social cognition training that includes explicit instruction to look at the eyes during emotion recognition may be integrated into treatment for alexithymia-related disorders. Although previous resting-state studies averaged EEG activity across several minutes, the dynamic nature of the FAA has also been deconstructed from the resting state and has shown better prediction of emotion-related processing (Allen and Cohen, 2010).
Compared to a previous task-related FAA study using visual inspection (Schöne ), the current data-driven method with multiple comparison correction assesses the timing of the effects of interest more objectively. Many studies have used this approach in time–frequency analyses to test the timing of focused effects (Pu and Yu, 2019; Wang ). Recently, a new framework, the self-to-other model of empathy, has been proposed to understand abnormal emotion processing in alexithymia, contending that impairment of the affective representation system results from impaired affective learning (Bird and Viding, 2014). Lower FAA has been associated with worse aversive learning (Schmid ). Future studies may investigate this further, with a focus on the relevance of eye-region processing, especially for facial expressions of sadness (Eisenbarth and Alpers, 2011). As for angry expressions, we observed larger FAA differences between intact and eyeless conditions in ALEX than in NonALEX. Increased FAA to approach-related expressions (e.g. anger) is associated with decreased processing of approach emotions (Harmon-Jones ). Our finding of increased FAA for the angry eye-region in ALEX may thus suggest less approach experience with the angry eye-region. Together, decreased FAA to sadness and increased FAA to anger in ALEX suggest deficits in both approach and withdrawal affective processing in alexithymic individuals. Although dominant emotion theories associate negative affect with avoidance motivation and positive affect with approach motivation (Lang, 1995), we used negative emotions with different approach/avoidance motivations to dissociate approach–withdrawal processing from general valence processing in alexithymia. Future studies are needed to further investigate approach–withdrawal affective processing in alexithymia and to reveal whether and how such deficits affect behavior in everyday life.
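The data-driven timing assessment with multiple comparison correction referred to above (the results section reports FDR correction over 10 window-wise comparisons) can be sketched with the Benjamini–Hochberg step-up procedure. This is a generic illustration of that correction, not the authors' exact pipeline; the toy P-values are assumptions.

```python
import numpy as np

def fdr_bh(pvals, alpha=0.05):
    """Benjamini-Hochberg FDR correction across analysis windows.

    Sorts the P-values, compares the i-th smallest against
    alpha * i / m, and returns a boolean mask (in the original
    order) of windows whose effect survives correction.
    """
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    ranked = p[order]
    thresh = alpha * (np.arange(1, m + 1) / m)
    below = ranked <= thresh
    passed = np.zeros(m, dtype=bool)
    if below.any():
        k = np.max(np.nonzero(below)[0])   # largest rank still under threshold
        passed[order[:k + 1]] = True       # step-up: all smaller P-values pass
    return passed

# Hypothetical uncorrected P-values from 10 successive time windows
pvals = [0.03, 0.001, 0.2, 0.008, 0.5, 0.6, 0.7, 0.8, 0.9, 0.95]
mask = fdr_bh(pvals)  # only the two smallest P-values survive here
```

Applying such a mask across consecutive time windows yields the corrected time course of an effect without the experimenter pre-selecting a window by eye.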
Although the N170 component has been considered to be involved in affective encoding of facial expressions (Luo ), we did not observe any significant group-related effect (see Supplementary Figures S1 and S2), suggesting that the N170 might be insensitive to affective deficits of the eye-region in individuals with alexithymia. Note that the observed deficits at the behavioral level were associated with depression and should thus be interpreted with caution. Many mental disorders, such as schizophrenia (Clark ), anxiety (Keil ) and psychopathy (Gehrer , 2020), as well as neurological disorders (e.g. Huntington's disease and epilepsy; Kordsachia ), are associated with abnormalities in emotive eye-region processing. Alexithymia, as a subclinical personality trait, is considered a transdiagnostic risk factor for various mental disorders (Honkalampi ). Therefore, the present findings may be of value for understanding alexithymia-related affective disorders. Despite progress regarding the important role of alexithymia in ASD (Bird , 2011), evidence from other disorders is still lacking. Given that elevated levels of alexithymia have been found in patients with various mental disorders (Frewen ; Cruise and Becerra, 2018), future studies would benefit from investigating the specific role of alexithymia in mental disorders, especially regarding difficulties in socio-affective processing. Several limitations of the present study should be mentioned. First, static facial pictures were used, whereas in real life we recognize emotion from dynamic visual information, e.g. eye movements (Kokinous ). Future studies should thus explore perceptive and affective mechanisms in alexithymia in a dynamic context for higher ecological validity. Second, we used only angry and sad stimuli; it is thus not clear whether the current conclusions extrapolate to other basic emotions, such as fear (Mériau ).
Third, we cannot attribute the current findings to negative specificity due to the lack of a neutral condition; this could be examined by adding a neutral control condition in future studies. Fourth, although the stimuli were matched for recognition rate and intensity between angry and sad faces, whether valence affects the current findings remains unclear because no valence ratings were available for this dataset. Fifth, all participants reported no current or past mental illness, but no diagnostic interview was administered to rule out possibly undiagnosed conditions. Sixth, we could not rule out physical or morphological differences in the relevance of the eye-region at the perceptive level when comparing angry and sad facial expressions, because physical properties are inherently tied to the specific facial expression and emotive eye-region (Bird ; Fujiwara, 2018). Finally, it has been shown that faces with disfigured features attract more fixations on the eyes and incur a higher number of recurrent fixations compared to faces with salience-matched occluding features (Boutsen ). People with alexithymia may be less sensitive to this effect. Combining measurement and analysis of scan paths (using an eye-tracker) for stimuli comprising eyeless faces could shed more light on this. To conclude, this study provides behavioral and electrophysiological evidence of abnormalities in eye-region processing of emotional expressions in individuals with clinically relevant alexithymia levels. Inspired by the two-stage model of facial expression processing (Calvo and Nummenmaa, 2016), our results suggest both perceptual and affective deficits in eye-region processing in alexithymia. These findings may have important implications for the understanding and ultimately the treatment of alexithymia-related affective disorders.
References (76 in total)

1.  Event-related frontal alpha asymmetries: electrophysiological correlates of approach motivation.

Authors:  Benjamin Schöne; Jessica Schomberg; Thomas Gruber; Markus Quirin
Journal:  Exp Brain Res       Date:  2015-11-04       Impact factor: 1.972

2.  Looking at the eyes interferes with facial emotion recognition in alexithymia.

Authors:  Esther Fujiwara
Journal:  J Abnorm Psychol       Date:  2018-05-24

3. The role of asymmetric frontal cortical activity in emotion-related phenomena: a review and update. (Review)

Authors:  Eddie Harmon-Jones; Philip A Gable; Carly K Peterson
Journal:  Biol Psychol       Date:  2009-09-04       Impact factor: 3.251

4.  Three stages of emotional word processing: an ERP study with rapid serial visual presentation.

Authors:  Dandan Zhang; Weiqi He; Ting Wang; Wenbo Luo; Xiangru Zhu; Ruolei Gu; Hong Li; Yue-Jia Luo
Journal:  Soc Cogn Affect Neurosci       Date:  2014-02-12       Impact factor: 3.436

5.  Eye contact during live social interaction in incarcerated psychopathic offenders.

Authors:  Nina A Gehrer; Andrew T Duchowski; Aiste Jusyte; Michael Schönenberg
Journal:  Personal Disord       Date:  2020-03-12

6.  Species sensitivity of early face and eye processing.

Authors:  Roxane J Itier; Patricia Van Roon; Claude Alain
Journal:  Neuroimage       Date:  2010-08-01       Impact factor: 6.556

7.  Why do alexithymic features appear to be stable? A 12-month follow-up study of a general population.

Authors:  K Honkalampi; H Koivumaa-Honkanen; A Tanskanen; J Hintikka; J Lehtonen; H Viinamäki
Journal:  Psychother Psychosom       Date:  2001 Sep-Oct       Impact factor: 17.659

8. Alexithymia: concept, measurement, and implications for treatment. (Review)

Authors:  G J Taylor
Journal:  Am J Psychiatry       Date:  1984-06       Impact factor: 18.112

9.  The stability of resting frontal electroencephalographic asymmetry in depression.

Authors:  John J B Allen; Heather L Urry; Sabrina K Hitt; James A Coan
Journal:  Psychophysiology       Date:  2004-03       Impact factor: 4.016

10.  Deconstructing the "resting" state: exploring the temporal dynamics of frontal alpha asymmetry as an endophenotype for depression.

Authors:  John J B Allen; Michael X Cohen
Journal:  Front Hum Neurosci       Date:  2010-12-29       Impact factor: 3.169
