
Multimodal emotion recognition using EEG and eye tracking data.

Wei-Long Zheng, Bo-Nan Dong, Bao-Liang Lu.   

Abstract

This paper presents a new emotion recognition method that combines electroencephalograph (EEG) signals and pupillary responses collected from an eye tracker. We select 15 emotional film clips in 3 categories (positive, neutral, and negative). The EEG signals and eye tracking data of five participants are recorded simultaneously while they watch these videos. We extract emotion-relevant features from the EEG signals and eye tracking data of 12 experiments and build a fusion model to improve the performance of emotion recognition. The best average accuracies based on EEG signals alone and eye tracking data alone are 71.77% and 58.90%, respectively. We also achieve average accuracies of 73.59% and 72.98% for the feature-level and decision-level fusion strategies, respectively. These results show that both feature-level and decision-level fusion of EEG signals and eye tracking data can improve the performance of the emotion recognition model.
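The abstract contrasts two standard fusion strategies: feature-level fusion (concatenate the modalities' features and train one classifier) and decision-level fusion (train one classifier per modality and combine their outputs). A minimal sketch of the two strategies, using synthetic data and a toy nearest-centroid classifier rather than the paper's actual features or models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for the two modalities (shapes and class structure are
# illustrative only, not the paper's feature sets): 60 samples,
# 8 EEG features, 4 pupillary-response features, 3 emotion classes.
n, n_eeg, n_eye, n_classes = 60, 8, 4, 3
labels = rng.integers(0, n_classes, size=n)
eeg = rng.normal(size=(n, n_eeg)) + labels[:, None] * 0.5
eye = rng.normal(size=(n, n_eye)) + labels[:, None] * 0.5

def nearest_centroid_fit(X, y):
    """Per-class mean vectors; a minimal classifier for the sketch."""
    return np.stack([X[y == c].mean(axis=0) for c in range(n_classes)])

def predict_scores(X, centroids):
    """Negative Euclidean distance to each centroid, used as class scores."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return -d

# Feature-level fusion: concatenate modality features, train one model.
fused = np.hstack([eeg, eye])
cent_fused = nearest_centroid_fit(fused, labels)
pred_feat = predict_scores(fused, cent_fused).argmax(axis=1)

# Decision-level fusion: one model per modality, then sum the class scores.
cent_eeg = nearest_centroid_fit(eeg, labels)
cent_eye = nearest_centroid_fit(eye, labels)
scores = predict_scores(eeg, cent_eeg) + predict_scores(eye, cent_eye)
pred_dec = scores.argmax(axis=1)

acc_feat = (pred_feat == labels).mean()
acc_dec = (pred_dec == labels).mean()
```

The design trade-off the abstract's numbers hint at: feature-level fusion lets the classifier learn cross-modal interactions, while decision-level fusion keeps each modality's model independent, which is more robust when one modality is noisy or missing.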

Year:  2014        PMID: 25571125     DOI: 10.1109/EMBC.2014.6944757

Source DB:  PubMed          Journal:  Conf Proc IEEE Eng Med Biol Soc        ISSN: 1557-170X


Related articles (9 in total)

1.  Electrocardiogram-Based Machine Learning Emulator Model for Predicting Novel Echocardiography-Derived Phenogroups for Cardiac Risk-Stratification: A Prospective Multicenter Cohort Study.

Authors:  Heenaben B Patel; Naveena Yanamala; Brijesh Patel; Sameer Raina; Peter D Farjo; Srinidhi Sunkara; Márton Tokodi; Nobuyuki Kagiyama; Grace Casaclang-Verzosa; Partho P Sengupta
Journal:  J Patient Cent Res Rev       Date:  2022-04-18

2.  The Effect of Time Window Length on EEG-Based Emotion Recognition.

Authors:  Delin Ouyang; Yufei Yuan; Guofa Li; Zizheng Guo
Journal:  Sensors (Basel)       Date:  2022-06-30       Impact factor: 3.847

3.  Classifying oscillatory brain activity associated with Indian Rasas using network metrics.

Authors:  Pankaj Pandey; Richa Tripathi; Krishna Prasad Miyapuram
Journal:  Brain Inform       Date:  2022-07-15

4.  A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation.

Authors:  Jamil Hussain; Wajahat Ali Khan; Taeho Hur; Hafiz Syed Muhammad Bilal; Jaehun Bang; Anees Ul Hassan; Muhammad Afzal; Sungyoung Lee
Journal:  Sensors (Basel)       Date:  2018-05-18       Impact factor: 3.576

5.  [Review] A Review of Emotion Recognition Using Physiological Signals.

Authors:  Lin Shu; Jinyan Xie; Mingyue Yang; Ziyi Li; Zhenqi Li; Dan Liao; Xiangmin Xu; Xinyi Yang
Journal:  Sensors (Basel)       Date:  2018-06-28       Impact factor: 3.576

6.  EmoTour: Estimating Emotion and Satisfaction of Users Based on Behavioral Cues and Audiovisual Data.

Authors:  Yuki Matsuda; Dmitrii Fedotov; Yuta Takahashi; Yutaka Arakawa; Keiichi Yasumoto; Wolfgang Minker
Journal:  Sensors (Basel)       Date:  2018-11-15       Impact factor: 3.576

7.  Multimodal Feature Fusion Method for Unbalanced Sample Data in Social Network Public Opinion.

Authors:  Jian Zhao; Wenhua Dong; Lijuan Shi; Wenqian Qiang; Zhejun Kuang; Dawei Xu; Tianbo An
Journal:  Sensors (Basel)       Date:  2022-07-25       Impact factor: 3.847

8.  [Review] The Application of Electroencephalogram in Driving Safety: Current Status and Future Prospects.

Authors:  Yong Peng; Qian Xu; Shuxiang Lin; Xinghua Wang; Guoliang Xiang; Shufang Huang; Honghao Zhang; Chaojie Fan
Journal:  Front Psychol       Date:  2022-07-22

9.  Eye-Tracking Analysis for Emotion Recognition.

Authors:  Paweł Tarnowski; Marcin Kołodziej; Andrzej Majkowski; Remigiusz Jan Rak
Journal:  Comput Intell Neurosci       Date:  2020-08-27
