Literature DB >> 34926739

ECSMP: A dataset on emotion, cognition, sleep, and multi-model physiological signals.

Zhilin Gao1, Xingran Cui2, Wang Wan1, Wenming Zheng2, Zhongze Gu1.   

Abstract

This paper describes the collection of multi-modal physiological signals, including electroencephalography (EEG), electrocardiograph (ECG), photoplethysmography (PPG), electrodermal activity (EDA), temperature (TEMP), and accelerometer (ACC) data, recorded from 89 healthy college students during resting state, emotion induction and recovery, and a set of cognitive function assessment tasks. Emotion, sleep, cognition, depression, mood, and other factors were evaluated through different methods and included in this dataset. Six emotions (neutral, fear, sad, happy, anger, and disgust) were induced by movie clips. Cognitive functions such as sustained attention, response inhibition, working memory, and strategy use were quantitatively measured by the Cambridge Neuropsychological Test Automated Battery (CANTAB). The sleep ECG was collected the night before the emotion-induction experiment, and sleep quality was analysed based on the sleep ECG. After the experiment, the participants were required to fill in questionnaires evaluating emotion regulation strategies, depression score, recent mood, and sleep quality. The database can be used directly for emotion recognition research on multi-modal physiological signals and, further, to explore the interactions between emotion, cognition, and sleep.
© 2021 The Authors. Published by Elsevier Inc.

Keywords:  Cognitive function assessment; EEG; Multi-model physiological signals; Sleep quality; Video-induced emotion

Year:  2021        PMID: 34926739      PMCID: PMC8648963          DOI: 10.1016/j.dib.2021.107660

Source DB:  PubMed          Journal:  Data Brief        ISSN: 2352-3409


Value of the Data

- Compared with other public emotion datasets, the physiological signals of EEG, ECG, PPG, EDA, TEMP, and ACC were recorded during both emotion induction (about 5 min) and emotional recovery (2 min). This dataset also includes ECG signals during sleep, cognitive ability assessments, and various scale evaluation results.
- Researchers interested in emotion recognition, the dynamic process of emotion induction and recovery, and the various factors that affect emotion can use this dataset to conduct multi-modal physiological signal research.
- EEG signals without a lowpass filter are provided so that researchers can analyse the information in high-frequency EEG signals.
- This dataset can be used to explore the interactions between emotion, cognition, and sleep, for example: 1) the impact of sleep quality on cognitive behaviour, 2) the impact of sleep quality on emotion induction, and 3) the association between cognitive abilities and positive/negative emotional responses.

Data Description

Physiological signals

In this dataset, physiological signals include EEG, ECG, PPG, EDA, TEMP, and ACC data.

EEG signals: Raw EEG signals were recorded by a Neuroscan SynAmps RT 64-channel amplifier at a 1000 Hz sampling rate throughout the whole experiment, then downsampled to 250 Hz to reduce storage space. The left ear (A1 channel) was used as the reference, and seven prefrontal channels (Fp1, Fpz, Fp2, AF3, AF4, F7, and F8) plus the right ear (A2) were recorded. No hardware filter was used. The EEG data of all subjects were located in the EEG_downsample folder. Each EEG file contains the original signal, electrode information, sampling rate, and event marks, and was saved in .mat format, which can be read in Matlab with the following command:

≫ EEG = load('filename');

ECG signals: ECG recordings were collected by a CFDA (China Food and Drug Administration) approved ambulatory electrocardiogram monitor (AECG-100A, Fengsheng Yongkang Software Co., Nanjing, China) with a computer-based data-acquisition system and sleep quality analysis software. The recorder is a single-lead Holter device, worn on the chest, that can record ECG for over 24 h. The ECG files recorded during the experiments were stored in the ECG_experiment folder, while those recorded during sleep were stored in the ECG_sleep folder. ECG signals were recorded at a 512 Hz sampling rate with 12-bit resolution during both the experiment and the sleep period (the night before the experiment), and saved in .bin format, which can be read in Matlab with the following command:

≫ [start_time, stop_time, fs, ECG] = readbindata('filepath', 'filename');

PPG, EDA, TEMP, and ACC signals: The PPG, EDA, TEMP, and ACC signals were recorded using the E4 wristband (Empatica, Milan, Italy) [1], a wrist-worn wireless device sized 44 × 40 × 16 mm and weighing 23 g. The E4 was worn on the left wrist because the right hand was used to operate the mouse or keyboard.
The physiological signals recorded by the E4 for all subjects were saved in the E4 folder. Each physiological signal is described in detail as follows:

The PPG signals were recorded using two green and two red LEDs on the dorsal wrist (sampling frequency: 64 Hz, resolution: 0.9 nW/digit). The RR-interval series (in seconds) and the average heart rate at a 1 Hz sampling rate (in beats/minute) were extracted from the PPG signals. The PPG signal was saved in the BVP.csv file; the RR series and average heart rate were saved in the IBI.csv and HR.csv files.

The EDA signals (sampling frequency: 4 Hz, resolution: 1 digit ≈ 900 picoSiemens) were recorded from the volar surface of the wrist using two stainless-steel electrodes 8 mm in diameter, and saved in the EDA.csv file.

The TEMP signals (sampling frequency: 4 Hz, resolution: 0.02 °C) were recorded by an optical infrared thermopile, and saved in the TEMP.csv file.

The ACC signals (sampling frequency: 32 Hz, resolution: 8 bits) were recorded by a three-axis accelerometer measuring acceleration on the x, y, and z axes within a ±2 g range, and saved in the ACC.csv file.

All E4 signals were saved in .csv format and can be read in Matlab with the following command:

≫ [time, ppg_fs, hr_fs, acc_fs, eda_fs, temp_fs, ppg, ibi, hr, acc, eda, temp, tags] = readE4data('filepath');
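For users working outside Matlab, the single-channel E4 files can be parsed directly. The following Python sketch assumes Empatica's standard CSV export layout (first row: initial UNIX timestamp; second row: sampling rate in Hz; remaining rows: samples); the function name is illustrative:

```python
import csv

def read_e4_csv(path):
    """Read a single-channel Empatica E4 export file (e.g. EDA.csv, TEMP.csv, HR.csv).

    Assumes Empatica's standard export layout: the first row holds the
    initial UNIX timestamp of the session, the second row the sampling
    rate in Hz, and every following row one sample value.
    """
    with open(path) as f:
        rows = [row for row in csv.reader(f) if row]
    t0 = float(rows[0][0])                      # session start, UNIX time (s)
    fs = float(rows[1][0])                      # sampling frequency (Hz)
    samples = [float(row[0]) for row in rows[2:]]
    return t0, fs, samples
```

Note that multi-column files (ACC.csv, with x/y/z columns) and IBI.csv (timestamp/interval pairs) use different layouts and would need small variations of this reader.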

Sleep quality analysis results

The sleep quality was analysed from the sleep ECG recorded the night before the experiment, and the results were saved in the sleep quality.xlsx file. Sleep quality was analysed with the cardiopulmonary coupling analysis algorithm proposed in [2]. The basic information and sleep quality parameters of each subject are shown in the table. The sleep quality parameters contain the following indices:

- signal quality: if the signal quality is unqualified, the sleep quality analysis cannot be performed;
- time of data collection: the total time the device collected ECG data, in hours;
- time in bed: the total time the subject spent in bed, in hours;
- sleep duration: the total sleep time, in hours;
- deep sleep: the total time of deep sleep, in hours;
- light sleep: the total time of light sleep, in hours;
- REM: the total time of rapid eye movement (REM) sleep, in hours;
- wake: the total time awake, in hours;
- deep sleep onset: the time it takes to fall into deep sleep for the first time, in minutes;
- sleep efficiency: the ratio of sleep duration to time in bed;
- apnea index: the average number of apnea occurrences per hour;
- apnea type: the proportion of central and obstructive apnea.
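The sleep efficiency index above is a simple ratio of sleep duration to time in bed; a minimal sketch (function name illustrative):

```python
def sleep_efficiency(sleep_duration_h, time_in_bed_h):
    """Sleep efficiency = sleep duration / time in bed (both in hours)."""
    if time_in_bed_h <= 0:
        raise ValueError("time in bed must be positive")
    return sleep_duration_h / time_in_bed_h
```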

Cognitive tests

The cognitive tests were performed using CANTAB [3]. The detailed and summary results of the cognitive tests, including motor screening (MOT), rapid visual information processing (RVP), stop signal task (SST), and spatial working memory (SWM), were saved in DetailedDatasheet.csv and SummaryDatasheet.csv, which were stored in the Cantab folder. The detailed datasheet recorded trial-by-trial results; the summary datasheet was calculated from the detailed datasheet for further study. For each task, the following statistics were analysed:

- MOT: mean latency, median latency, mean error;
- RVP: sensitivity index (RVP A'), probability of hit, total false alarms, mean latency, median latency;
- SST: stop signal reaction time of the last half, mean correct reaction time on go trials, median correct reaction time on go trials, direction errors on stop and go trials, proportion of successful stops (last half), stop signal delay of the last half;
- SWM: mean time between revisit errors, strategy (the number of times the subject begins a new search with a different box).

Comparisons between the summary datasheet and norm-referenced data were displayed in a report in .html format.
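The text lists RVP A' but does not define it; in signal detection analyses of sustained attention it is commonly the non-parametric sensitivity index A' computed from the hit and false-alarm rates. The following sketch implements the standard A' formula and is an assumption, not a formula taken from the paper:

```python
def a_prime(hit_rate, fa_rate):
    """Non-parametric sensitivity index A' from signal detection theory.

    Assumes 0 <= fa_rate <= hit_rate <= 1; higher values indicate
    better target detection (0.5 corresponds to chance performance).
    """
    h, f = hit_rate, fa_rate
    if not (0 <= f <= h <= 1):
        raise ValueError("expected 0 <= fa_rate <= hit_rate <= 1")
    if h == f:
        return 0.5  # no discrimination between targets and non-targets
    return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
```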

Questionnaires

The questionnaire scores, comprising the emotion regulation questionnaire (ERQ) [4], self-rating depression scale (SDS) [5], 40-item profile of mood states questionnaire (POMS) [6], Pittsburgh sleep quality index (PSQI) [7], and current emotional self-assessment form (CESAF), were saved in scale.xlsx. The expressive suppression and cognitive reappraisal scores of the ERQ were calculated to evaluate the tendency of emotion regulation strategies. The final SDS score was the sum of all item scores multiplied by 1.25. In the POMS sheet, the tension, anger, fatigue, depression, vigor, confusion, esteem, and total mood disturbance (TMD) scores were all computed. The levels of sleep quality, sleep latency, sleep duration, sleep efficiency, sleep disorders, hypnotic use, and daytime dysfunction, together with the global PSQI score, were calculated and shown in the PSQI sheet. The CESAF was used to evaluate the six-dimensional emotional state (neutral, fear, sad, happy, anger, and disgust) after watching the video clips and the fatigue level after completing the cognitive assessment tasks.
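The SDS scoring described above (final score = sum of item scores × 1.25) can be sketched as follows; the 20-item count and the 1-4 item range come from the standard SDS instrument and are not stated in the text:

```python
def sds_index(item_scores):
    """SDS index score: the raw sum of the 20 item scores (each 1-4),
    multiplied by 1.25 as described in the text."""
    if len(item_scores) != 20:
        raise ValueError("the SDS has 20 items")
    if not all(1 <= s <= 4 for s in item_scores):
        raise ValueError("each item score must be in the range 1-4")
    return sum(item_scores) * 1.25
```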

Subjects’ information

The basic information of all 89 subjects was recorded in the experiment subjects.xlsx file. In the experiment sheet, the basic information included ID, sex, age, completion state, experiment date, experiment sequence, order of stimulus videos, and remarks. In the sleep sheet, it included ID, sex, age, completion state, time of falling asleep, wake time, and remarks.

Experimental Design, Materials and Methods

This dataset comprises two stages: a sleep stage and an experiment stage. A total of 89 healthy college students (32 males and 57 females, age: 23.68 ± 2.12 years) from Southeast University participated in the study and signed an informed consent form. Among all subjects, sleep ECG of 73 healthy subjects (26 males and 47 females, age: 23.53 ± 2.06 years) was collected in this dataset. The sleep stage took place the night before the experiment stage, except for 7 subjects who recorded their sleep ECG after the experiment stage for personal reasons. The day before the experiment stage, the subjects picked up the portable ECG device and learned how to wear and operate it. All participants were instructed to wear the ECG device before going to bed and remove it after waking up in the morning.

In the experiment stage, participants completed a video-watching task and a cognitive assessment task in random order. The experiment was carried out in a quiet and dimly lit room, with participants seated 0.5 m away from the screen. Before the two tasks, a 3-min eyes-closed and a 3-min eyes-open resting state were recorded as the baseline, during which participants were required to stare at a white cross on a black screen.

During the cognitive assessment task, participants completed the MOT, RVP, SST, and SWM tests using CANTAB on a Surface Pro 4. The detailed descriptions of these tests are as follows:

MOT is a training procedure designed to relax the subjects and to introduce them to the computer and touch screen. Participants were required to touch a series of crosses that flashed at different points on the screen.

RVP is sensitive to dysfunction in the parietal and frontal lobe areas of the brain and is also a sensitive measure of visual sustained attention. A white box appears in the center of the screen, in which single digits appear at a rate of 100 digits per minute. When participants detect one of the target sequences (2-4-6, 4-6-8, or 3-5-7), they respond by touching the blue button. Nine target sequences are presented per minute.

SST is a classic stop-signal response inhibition test, which uses staircase functions to generate an estimate of the stop signal reaction time. When an arrow appears on the screen, participants press the button corresponding to the direction of the arrow. An auditory tone serves as the stop signal: if the tone and an arrow appear at the same time, participants must suppress their response.

SWM tests the subjects' ability to retain spatial information and to manipulate remembered items in working memory. Participants open the coloured boxes by touching them, aiming to find the blue tokens. The difficulty increases as the number of boxes increases. Touching any box in which a blue token has already been found is an error.

Once the cognitive assessment task was completed, participants filled in a self-assessment scale of fatigue level, scored from 1 (not tired at all) to 6 (very tired).

During the video-watching task, the participants watched six different movie clips, each about 5 min long, to induce specific emotions [8]. The six clips were ‘World Heritage In China’ (a documentary), ‘The Conjuring’ (a horror movie), ‘Nuan Chun’ (a touching movie), ‘Top Funny Comedian’ (a situation comedy), ‘Never Talk to Strangers’ (a domestic violence television series), and ‘The Fly’ (a classic science fiction horror movie), used to induce neutral, fear, sad, happy, anger, and disgust, respectively. The playback sequence of the videos was randomized for each participant.

After watching each emotional video, the participants were required to fill in the CESAF to describe their current feelings (about 30 s), and then to take a 90 s eyes-closed rest to ensure they could recover from the previously induced emotion. The CESAF is scored on six emotional dimensions (neutral, fear, sad, happy, anger, and disgust) on a scale of 1 to 7 assessing the arousal of each dimension; the higher the score, the stronger the arousal of that emotion.

All participants wore electrodes on the prefrontal area and ears to record scalp EEG, a portable ECG monitor on the chest to collect single-lead ECG, and the E4 on the wrist to acquire PPG, ACC, EDA, and TEMP signals during the whole experiment. Among all subjects, scalp EEG of 87 healthy college students (31 males and 56 females, age: 23.59 ± 2.02 years), ECG of 83 subjects (29 males and 54 females, age: 23.82 ± 1.93 years), and E4 data of 67 participants (24 males and 43 females, age: 23.82 ± 1.93 years) were collected and saved in this dataset. After completing all tasks, 88 subjects (31 males and 57 females, age: 23.69 ± 2.14 years) filled in scales including the ERQ, SDS, POMS, and PSQI.

Ethics Statement

All participants were informed about the experimental protocol and matters needing attention, and signed the informed consent before the experiment. This study was approved by the IEC for Clinical Research of Zhongda Hospital, Affiliated to Southeast University (No. 2019ZDSYLL073-P01).

CRediT authorship contribution statement

Zhilin Gao: Conceptualization, Methodology, Software, Data curation, Writing – original draft, Writing – review & editing. Xingran Cui: Conceptualization, Validation, Writing – review & editing, Resources, Supervision, Project administration. Wang Wan: Investigation, Data curation. Wenming Zheng: Writing – review & editing. Zhongze Gu: Supervision, Project administration.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Specifications Table

Subject: Biomedical Engineering
Specific subject area: Emotion recognition, Physiological signal processing, Multichannel EEG acquisition, Sleep ECG, Cognitive assessment
Type of data: Raw data of electroencephalography (EEG), electrocardiograph (ECG), photoplethysmography (PPG), electrodermal activity (EDA), temperature (TEMP), accelerometer (ACC), and tables.
How the data were acquired: EEG signals were recorded by a Neuroscan SynAmps RT. ECG signals were recorded by the AECG-100A ECG recorder. PPG, EDA, TEMP, and ACC data were recorded by the Empatica E4 wristband. The sleep quality table contains the analysis results based on the sleep ECG. The scale tables contain the self-reported questionnaire results. The cognition assessment tables were measured by the Cambridge Neuropsychological Test Automated Battery (CANTAB).
Data format: Raw; Preprocessed; Analysed
Parameters for data collection: The experiment was carried out in a quiet and dimly lit room. EEG signals were monitored at a 1000 Hz sampling rate (downsampled to 250 Hz); seven forehead channels (Fp1, Fpz, Fp2, AF3, AF4, F7, and F8) were recorded. Valid ECG signals were collected at a 512 Hz sampling rate. The sampling rates of the PPG, EDA, TEMP, and ACC signals were 64 Hz, 4 Hz, 4 Hz, and 32 Hz, respectively.
Description of data collection: The experiment was divided into two parts, emotional stimulation and cognitive testing, in random order. Multi-modal physiological signals were recorded during the experiment. The subjects were required to collect sleep ECG data the night before the experiment and fill in the questionnaires after the experiment.
Data source location: Southeast University, Nanjing, 210096, China.
Data accessibility: Data is hosted on a public repository. Repository name: Mendeley Data. Data identification number: http://dx.doi.org/10.17632/vn5nknh3mn.2. Direct URL to data: https://data.mendeley.com/datasets/vn5nknh3mn/2
Related research article: Zhilin Gao, Xingran Cui, Wang Wan, Wenming Zheng, Zhongze Gu, Long-range Correlation Analysis of High Frequency Prefrontal Electroencephalogram Oscillations for Dynamic Emotion Recognition, Biomedical Signal Processing and Control.
References

1.  Individual differences in two emotion regulation processes: implications for affect, relationships, and well-being.

Authors:  James J Gross; Oliver P John
Journal:  J Pers Soc Psychol       Date:  2003-08

2.  An electrocardiogram-based technique to assess cardiopulmonary coupling during sleep.

Authors:  Robert Joseph Thomas; Joseph E Mietus; Chung-Kang Peng; Ary L Goldberger
Journal:  Sleep       Date:  2005-09

3.  The Pittsburgh Sleep Quality Index: a new instrument for psychiatric practice and research.

Authors:  D J Buysse; C F Reynolds; T H Monk; S R Berman; D J Kupfer
Journal:  Psychiatry Res       Date:  1989-05

4.  Application of Permutation Entropy and Permutation Min-Entropy in Multiple Emotional States Analysis of RRI Time Series.

Authors:  Yirong Xia; Licai Yang; Luciano Zunino; Hongyu Shi; Yuan Zhuang; Chengyu Liu
Journal:  Entropy (Basel)       Date:  2018-02-26

