Huiyan Lin, Claudia Schulz, Thomas Straube.
Abstract
Expectancy congruency has been shown to modulate event-related potentials (ERPs) to emotional stimuli, such as facial expressions. However, it is unknown whether the congruency ERP effects to facial expressions can be modulated by cognitive manipulations during stimulus expectation. To this end, electroencephalography (EEG) was recorded while participants viewed (neutral and fearful) facial expressions. Each trial started with a cue, predicting a facial expression, followed by an expectancy interval without any cues and subsequently the face. In half of the trials, participants had to solve a cognitive task in which different letters were presented for target letter detection during the expectancy interval. Furthermore, facial expressions were congruent with the cues in 75% of all trials. ERP results revealed that for fearful faces, the cognitive task during expectation altered the congruency effect in N170 amplitude; congruent compared to incongruent fearful faces evoked larger N170 in the non-task condition but the congruency effect was not evident in the task condition. Regardless of facial expression, the congruency effect was generally altered by the cognitive task during expectation in P3 amplitude; the amplitudes were larger for incongruent compared to congruent faces in the non-task condition but the congruency effect was not shown in the task condition. The findings indicate that cognitive tasks during expectation reduce the processing of expectation and subsequently, alter congruency ERP effects to facial expressions.
Keywords: ERPs; N170; P3; cognitive tasks during expectation; emotional congruency; facial expression
Year: 2015 PMID: 26578938 PMCID: PMC4623202 DOI: 10.3389/fnhum.2015.00596
Source DB: PubMed Journal: Front Hum Neurosci ISSN: 1662-5161 Impact factor: 3.169
Figure 1. Experimental procedure. A cue-face paradigm was performed either with a cognitive task between the cue and the face (upper panel) or without one (lower panel).
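The cue-face design can be sketched in code. The following hypothetical Python snippet builds one condition cell (one expression cue, with or without the interposed task) containing exactly 75% cue-congruent trials; the function name and the per-cell trial count of 160 (which would yield the 120 congruent / 40 incongruent maxima seen in the trial-count table below) are illustrative assumptions, not details taken from the paper's methods.

```python
import random

def build_condition_trials(cue, n_trials=160, p_congruent=0.75, seed=0):
    """Hypothetical sketch of one condition cell of the cue-face design.

    The cue ("neutral" or "fearful") predicts the facial expression
    correctly on exactly p_congruent of the trials. Trial count and
    naming are assumptions for illustration only.
    """
    other = "fearful" if cue == "neutral" else "neutral"
    n_congruent = round(n_trials * p_congruent)  # e.g., 120 of 160
    faces = [cue] * n_congruent + [other] * (n_trials - n_congruent)
    random.Random(seed).shuffle(faces)  # randomize trial order
    return [{"cue": cue, "face": f, "congruent": f == cue} for f in faces]
```

In practice such a list would be interleaved with the other condition cells and presented in randomized order; artifact rejection then reduces the per-cell counts, which is why the reported means fall below the 120/40 maxima.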
Number of trials per experimental condition (Mean, Min, Max):
| | Neutral, non-task | | | Neutral, task | | | Fearful, non-task | | | Fearful, task | | |
| | Mean | Min | Max | Mean | Min | Max | Mean | Min | Max | Mean | Min | Max |
| Congruent | 108.41 | 87 | 120 | 107.77 | 84 | 119 | 106.82 | 87 | 120 | 107.95 | 88 | 119 |
| Incongruent | 35.82 | 28 | 40 | 36.45 | 31 | 40 | 35.77 | 29 | 40 | 36.73 | 30 | 40 |
Figure 2. Accuracy (ACC; left panel) and reaction times (RTs; right panel) for recognition of facial expressions in each experimental condition. Vertical lines indicate the standard error of the mean.
Accuracy (proportion correct; M and SEM) for recognition of facial expressions:
| | Neutral, non-task | | Neutral, task | | Fearful, non-task | | Fearful, task | |
| | M | SEM | M | SEM | M | SEM | M | SEM |
| Congruent | 0.96 | 0.01 | 0.93 | 0.01 | 0.92 | 0.01 | 0.89 | 0.01 |
| Incongruent | 0.94 | 0.01 | 0.92 | 0.01 | 0.85 | 0.02 | 0.89 | 0.01 |
Reaction times (ms; M and SEM):
| | Neutral, non-task | | Neutral, task | | Fearful, non-task | | Fearful, task | |
| | M | SEM | M | SEM | M | SEM | M | SEM |
| Congruent | 548.81 | 12.98 | 588.19 | 9.48 | 556.39 | 14.97 | 596.16 | 12.26 |
| Incongruent | 569.53 | 15.02 | 605.83 | 11.94 | 581.15 | 13.73 | 607.80 | 12.21 |
Figure 3. ERPs at parieto-occipital electrodes (PO9 and PO10) for all experimental conditions. Shaded areas correspond to the analysis window for the N170 (130–180 ms).
Figure 4. ERPs at parietal electrodes (P3, Pz, and P4) for all experimental conditions. Shaded areas correspond to the analysis window for the P3 (450–650 ms).
Figure 5. Topographical maps based on mean amplitudes of the N170 (130–180 ms) and P3 (450–650 ms) for all experimental conditions.
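The amplitudes tabulated below are windowed means: for each trial, the single-channel voltage is averaged over the samples falling inside the component's analysis window, then averaged across trials. A minimal NumPy sketch of that summary statistic follows; it is illustrative only, and a real pipeline (e.g., MNE-Python) would additionally handle filtering, baseline correction, and artifact rejection.

```python
import numpy as np

def mean_amplitude(epochs, times, tmin, tmax):
    """Mean ERP amplitude within an analysis window.

    epochs: array of shape (n_trials, n_samples), single-channel
            voltages in microvolts.
    times:  array of shape (n_samples,), sample times in seconds.
    Returns the grand mean over trials and over all samples with
    tmin <= t <= tmax, e.g., 0.130-0.180 s for the N170 or
    0.450-0.650 s for the P3 windows shaded in the figures.
    """
    mask = (times >= tmin) & (times <= tmax)  # samples in the window
    return float(epochs[:, mask].mean())
```

Applied per electrode and condition, this yields one value per participant, which is then averaged across participants to give the M (and SEM) entries in the tables below.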
Mean N170 amplitudes (µV; M and SEM) at electrodes PO9 and PO10:
| | | Neutral, non-task | | Neutral, task | | Fearful, non-task | | Fearful, task | |
| | | M | SEM | M | SEM | M | SEM | M | SEM |
| PO9 | Congruent | −2.92 | 0.63 | −2.69 | 0.68 | −4.07 | 0.65 | −3.07 | 0.73 |
| | Incongruent | −3.00 | 0.72 | −2.56 | 0.80 | −3.29 | 0.66 | −3.04 | 0.76 |
| PO10 | Congruent | −4.09 | 0.81 | −3.56 | 0.73 | −5.06 | 0.80 | −4.24 | 0.75 |
| | Incongruent | −4.10 | 0.83 | −3.57 | 0.74 | −4.45 | 0.82 | −4.20 | 0.78 |
Mean P3 amplitudes (µV; M and SEM) at electrodes P3, Pz, and P4:
| | | Neutral, non-task | | Neutral, task | | Fearful, non-task | | Fearful, task | |
| | | M | SEM | M | SEM | M | SEM | M | SEM |
| P3 | Congruent | 5.55 | 0.53 | 4.96 | 0.65 | 6.05 | 0.53 | 5.32 | 0.59 |
| | Incongruent | 6.27 | 0.59 | 4.55 | 0.63 | 6.37 | 0.52 | 5.39 | 0.58 |
| Pz | Congruent | 8.40 | 0.88 | 5.87 | 0.90 | 8.79 | 1.04 | 6.58 | 0.90 |
| | Incongruent | 8.81 | 0.91 | 5.24 | 0.97 | 9.25 | 1.00 | 6.53 | 0.86 |
| P4 | Congruent | 5.56 | 0.67 | 4.96 | 0.67 | 6.55 | 0.72 | 5.51 | 0.74 |
| | Incongruent | 6.14 | 0.72 | 5.23 | 0.79 | 6.29 | 0.76 | 5.19 | 0.76 |