Reliable emotion recognition system based on dynamic adaptive fusion of forehead biopotentials and physiological signals.

Mahdi Khezri, Mohammad Firoozabadi, Ahmad Reza Sharafat.

Abstract

In this study, we proposed a new adaptive method for fusing multiple emotional modalities to improve the performance of an emotion recognition system. Three-channel forehead biosignals and peripheral physiological measurements (blood volume pulse, skin conductance, and interbeat intervals) were used as emotional modalities. Six basic emotions (anger, sadness, fear, disgust, happiness, and surprise) were elicited by displaying preselected video clips to each of the 25 participants, while the physiological signals were recorded simultaneously. In our multimodal emotion recognition system, the recorded signals were fed to several classification units, each of which identified the emotions independently; the results were then fused with an adaptive weighted linear model to produce the final decision. Each classification unit is assigned a weight that is determined dynamically from its performance during the testing phase together with its training-phase results. This dynamic weighting scheme enables the emotion recognition system to adapt itself to each new user. The results showed that the proposed method outperformed both conventional fusion of the features and fusion of classification units by majority voting. It also showed a considerable improvement over systems that fuse classification units with static weighting schemes. Using support vector machine (SVM) and k-nearest neighbors (KNN) classifiers, overall classification accuracies of 84.7% and 80%, respectively, were obtained. Moreover, applying only the forehead or only the physiological signals in the proposed scheme indicates that a reliable emotion recognition system can be designed without additional emotional modalities.
Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
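The weighted linear fusion described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the score layout, and the multiplicative weight update are illustrative assumptions. Each classification unit emits a score per emotion class, the fused score is a weighted sum across units, and weights are adjusted dynamically according to whether each unit was correct:

```python
# Hypothetical sketch of dynamic weighted linear fusion of classification units.
# The six basic emotions used in the study.
EMOTIONS = ["anger", "sadness", "fear", "disgust", "happiness", "surprise"]

def fuse_predictions(unit_scores, weights):
    """Weighted linear fusion of per-unit class scores.

    unit_scores: list of rows, one per classification unit, each row holding
                 that unit's score for every emotion class.
    weights:     one reliability weight per unit (e.g. from training accuracy).
    Returns the fused emotion label and the fused score vector.
    """
    total = sum(weights)
    norm = [w / total for w in weights]  # normalize weights to sum to 1
    n_classes = len(unit_scores[0])
    fused = [sum(norm[u] * unit_scores[u][c] for u in range(len(unit_scores)))
             for c in range(n_classes)]
    best = max(range(n_classes), key=lambda c: fused[c])
    return EMOTIONS[best], fused

def update_weights(weights, correct, lr=0.1):
    """Illustrative dynamic update: boost units that agreed with the final
    (or ground-truth) label, decay those that did not, then renormalize."""
    w = [wi * (1 + lr if ok else 1 - lr) for wi, ok in zip(weights, correct)]
    s = sum(w)
    return [wi / s for wi in w]

if __name__ == "__main__":
    # Two units score a test sample; both favor "anger" to different degrees.
    scores = [[0.9, 0.1, 0.0, 0.0, 0.0, 0.0],
              [0.2, 0.7, 0.1, 0.0, 0.0, 0.0]]
    label, fused = fuse_predictions(scores, [0.5, 0.5])
    print(label)  # fused decision for this sample
    # Unit 0 was correct, unit 1 was not: its weight is boosted for the next sample.
    print(update_weights([0.5, 0.5], [True, False]))
```

A per-user update of this kind is what lets the fused system adapt during the testing phase rather than relying on fixed training-time weights alone.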

Keywords:  Dynamic adaptive fusion of classification units; Emotion recognition system; Forehead bioelectric signals; Human computer interactions; Physiological signals

Year: 2015    PMID: 26253158    DOI: 10.1016/j.cmpb.2015.07.006

Source DB: PubMed    Journal: Comput Methods Programs Biomed    ISSN: 0169-2607    Impact factor: 5.428


  6 in total

1.  Cross-Subject EEG Feature Selection for Emotion Recognition Using Transfer Recursive Feature Elimination.

Authors:  Zhong Yin; Yongxiong Wang; Li Liu; Wei Zhang; Jianhua Zhang
Journal:  Front Neurorobot       Date:  2017-04-10       Impact factor: 2.650

2.  EEG-Based Emotion Recognition Using Quadratic Time-Frequency Distribution.

Authors:  Rami Alazrai; Rasha Homoud; Hisham Alwanni; Mohammad I Daoud
Journal:  Sensors (Basel)       Date:  2018-08-20       Impact factor: 3.576

3.  A Spherical Phase Space Partitioning Based Symbolic Time Series Analysis (SPSP-STSA) for Emotion Recognition Using EEG Signals.

Authors:  Hoda Tavakkoli; Ali Motie Nasrabadi
Journal:  Front Hum Neurosci       Date:  2022-06-29       Impact factor: 3.473

4.  Emotion Recognition Based on Weighted Fusion Strategy of Multichannel Physiological Signals.

Authors:  Wei Wei; Qingxuan Jia; Yongli Feng; Gang Chen
Journal:  Comput Intell Neurosci       Date:  2018-07-05

5.  Coverage of Emotion Recognition for Common Wearable Biosensors.

Authors:  Terence K L Hui; R Simon Sherratt
Journal:  Biosensors (Basel)       Date:  2018-03-24

6.  Can We Ditch Feature Engineering? End-to-End Deep Learning for Affect Recognition from Physiological Sensor Data.

Authors:  Maciej Dzieżyc; Martin Gjoreski; Przemysław Kazienko; Stanisław Saganowski; Matjaž Gams
Journal:  Sensors (Basel)       Date:  2020-11-16       Impact factor: 3.576

