Bo Yu, Lin Ma, Haifeng Li, Lun Zhao, Hongjian Bo, Xunda Wang.
Abstract
Estimation of human emotions from electroencephalogram (EEG) signals plays a vital role in affective brain-computer interfaces (BCI). The present study investigated the event-related synchronization (ERS) and event-related desynchronization (ERD) of typical brain oscillations during the processing of facial expressions under a nonattentional condition. The results show that the lower-frequency bands are mainly involved in updating facial expressions and distinguishing deviant stimuli from standard ones, whereas the higher-frequency bands are relevant to the automatic processing of different facial expressions. Accordingly, we established relations between each brain oscillation and the processing of unattended facial expressions using the ERD and ERS measures. This research is the first to reveal the contribution of each frequency band to the comprehension of facial expressions in the preattentive stage. It also provides evidence that participants have emotional experiences under the nonattentional condition. Therefore, a user's emotional state under the nonattentional condition can be recognized in real time from the ERD/ERS computation indexes of the different frequency bands of brain oscillations, which can be used in affective BCI to provide the user with more natural and friendly interaction.
Year: 2016 PMID: 27471545 PMCID: PMC4947680 DOI: 10.1155/2016/8958750
Source DB: PubMed Journal: Comput Math Methods Med ISSN: 1748-670X Impact factor: 2.238
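The ERD/ERS computation index referenced in the abstract is, in the classical Pfurtscheller and Lopes da Silva formulation, the percentage change of band power in a post-stimulus interval A relative to a pre-stimulus reference interval R: ERD/ERS% = (A − R) / R × 100, where negative values indicate desynchronization (power decrease) and positive values indicate synchronization (power increase). The sketch below illustrates that computation for the six frequency bands listed in the table further down. It is a minimal sketch, not the authors' code: the function names, the 200 ms baseline length, and the Butterworth filter settings are assumptions introduced here for illustration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Frequency band edges (Hz) as listed in the results table.
BANDS = {
    "delta":  (1, 4),
    "theta":  (4, 8),
    "alpha1": (8, 11),
    "alpha2": (11, 13),
    "beta1":  (13, 20),
    "beta2":  (20, 30),
}

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter (assumed design choice)."""
    b, a = butter(order, [lo, hi], btype="band", fs=fs)
    return filtfilt(b, a, x)

def erd_ers(epochs, fs, band, baseline=0.2, win=(0.0, 0.1)):
    """ERD/ERS (%) of one channel for one frequency band.

    epochs   : (n_trials, n_samples) array, each trial time-locked so that
               stimulus onset falls at sample int(baseline * fs).
    baseline : assumed pre-stimulus reference length in seconds.
    win      : post-stimulus analysis interval in seconds, e.g. (0.0, 0.1)
               for the 0-100 ms interval in the table.
    Returns a negative value for ERD (power decrease) and a positive
    value for ERS (power increase).
    """
    lo, hi = BANDS[band]
    filtered = np.stack([bandpass(tr, lo, hi, fs) for tr in epochs])
    power = filtered ** 2                    # instantaneous band power
    avg = power.mean(axis=0)                 # average across trials
    t0 = int(baseline * fs)                  # stimulus-onset sample
    r = avg[:t0].mean()                      # reference-interval power R
    a = avg[t0 + int(win[0] * fs): t0 + int(win[1] * fs)].mean()  # A
    return (a - r) / r * 100.0
```

For example, `erd_ers(epochs, fs=500, band="theta", win=(0.0, 0.1))` would give the theta-band index for the 0–100 ms interval of the table; the 500 Hz sampling rate here is illustrative, not taken from the paper.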
Figure 1. Samples of schematic faces with sad, happy, and neutral expressions.
Figure 2. Examples of the stimulus sequences.
Figure 3. Electrode sites in different brain areas.
ERD/ERS computation index results in each frequency band, showing main effects and interaction effects in each time interval (ms).
| Frequency band | Interval (ms) | Main effect | Interaction effect |
|---|---|---|---|
| Delta (1–4 Hz) | 0–100 | Stimulus Type in the left frontal | Facial Expression × Brain Area |
| | 100–200 | Stimulus Type in the right central and left parietal-occipital | Facial Expression × Brain Area |
| | 200–300 | Stimulus Type | Facial Expression × Stimulus Type × Brain Area |
| | 300–400 | Stimulus Type | Facial Expression × Stimulus Type × Brain Area |
| Theta (4–8 Hz) | 0–100 | Stimulus Type | |
| | 100–200 | Stimulus Type | |
| | 200–300 | Stimulus Type | |
| | 300–400 | Stimulus Type | |
| Alpha 1 (8–11 Hz) | 0–100 | Brain Area | |
| | 100–250 | Brain Area | |
| | 250–400 | Brain Area | |
| Alpha 2 (11–13 Hz) | 0–100 | Facial Expression | |
| | 100–250 | Brain Area | Facial Expression × Stimulus Type in the right parietal-occipital |
| | 250–400 | Brain Area × Hemisphere | |
| Beta 1 (13–20 Hz) | 50–150 | | |
| | 150–250 | Facial Expression in the right central and the left (right) parietal-occipital | Facial Expression × Brain Area |
| | 250–300 | Stimulus Type in the left central | Facial Expression × Stimulus Type |
| | 300–350 | Facial Expression in the left parietal-occipital | Facial Expression × Stimulus Type × Brain Area |
| | 350–400 | Facial Expression | Facial Expression × Stimulus Type × Brain Area |
| Beta 2 (20–30 Hz) | 50–150 | Facial Expression in the left frontal | |
| | 150–250 | | |
| | 250–300 | | Facial Expression × Stimulus Type × Brain Area |
| | 300–350 | Facial Expression in the left frontal (central) | Facial Expression × Stimulus Type |
| | 350–400 | Stimulus Type in the left central | Facial Expression × Stimulus Type in the left central |
Figure 4. ERD/ERS in the delta frequency band.
Figure 5. ERD/ERS in the theta frequency band.
Figure 6. ERD/ERS in the alpha 1 frequency band.
Figure 7. ERD/ERS in the alpha 2 frequency band.
Figure 8. ERD/ERS in the beta 1 frequency band.
Figure 9. ERD/ERS in the beta 2 frequency band.