
Towards an intelligent framework for multimodal affective data analysis.

Soujanya Poria, Erik Cambria, Amir Hussain, Guang-Bin Huang.

Abstract

An increasingly large amount of multimodal content is posted every day on social media websites such as YouTube and Facebook. To cope with the growth of such multimodal data, there is an urgent need for an intelligent multimodal analysis framework that can effectively extract information from multiple modalities. In this paper, we propose a novel multimodal information extraction agent, which infers and aggregates the semantic and affective information associated with user-generated multimodal data in contexts such as e-learning, e-health, automatic video content tagging and human-computer interaction. In particular, the developed intelligent agent adopts an ensemble feature extraction approach, exploiting the joint use of tri-modal (text, audio and video) features to enhance the multimodal information extraction process. In preliminary experiments on the eNTERFACE dataset, the proposed multimodal system achieves an accuracy of 87.95%, outperforming the best state-of-the-art system by more than 10%, or, in relative terms, a 56% reduction in error rate.
Copyright © 2014 Elsevier Ltd. All rights reserved.
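
The tri-modal ensemble described in the abstract can be illustrated with a minimal sketch of feature-level (early) fusion, in which per-modality feature vectors are concatenated into one joint vector before classification. This is a hypothetical illustration, not the paper's implementation: all function names are assumptions, and the placeholder features stand in for the far richer text, audio and facial-expression features the actual framework extracts.

```python
# Hypothetical sketch of feature-level (early) fusion of tri-modal
# (text, audio, video) features. All extractors below are illustrative
# placeholders, not the features used in the actual framework.

def extract_text_features(utterance):
    # Placeholder: character count and a crude word count.
    return [float(len(utterance)), float(utterance.count(" ") + 1)]

def extract_audio_features(samples):
    # Placeholder: mean amplitude and average energy of the signal.
    n = len(samples)
    mean = sum(samples) / n
    energy = sum(x * x for x in samples) / n
    return [mean, energy]

def extract_video_features(frames):
    # Placeholder: just the number of frames.
    return [float(len(frames))]

def fuse_features(text, audio, video):
    # Early fusion: concatenate the per-modality feature vectors
    # into a single joint vector for a downstream classifier.
    return (extract_text_features(text)
            + extract_audio_features(audio)
            + extract_video_features(video))

joint = fuse_features("I am so happy today",
                      [0.1, -0.2, 0.3, 0.0],
                      ["frame1", "frame2", "frame3"])
print(len(joint))  # 2 text + 2 audio + 1 video = 5 features
```

The joint vector would then feed a single classifier; the paper's ensemble approach additionally combines multiple extractors per modality before fusion.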

Keywords:  Affective computing; Emotion analysis; Facial expressions; Multimodal; Multimodal sentiment analysis; Speech; Text

Year:  2014        PMID: 25523041     DOI: 10.1016/j.neunet.2014.10.005

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Related articles:  8 in total

1.  A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation.

Authors:  Jamil Hussain; Wajahat Ali Khan; Taeho Hur; Hafiz Syed Muhammad Bilal; Jaehun Bang; Anees Ul Hassan; Muhammad Afzal; Sungyoung Lee
Journal:  Sensors (Basel)       Date:  2018-05-18       Impact factor: 3.576

2.  A Predictive Multimodal Framework to Alert Caregivers of Problem Behaviors for Children with ASD (PreMAC).

Authors:  Zhaobo K Zheng; John E Staubitz; Amy S Weitlauf; Johanna Staubitz; Marney Pollack; Lauren Shibley; Michelle Hopton; William Martin; Amy Swanson; Pablo Juárez; Zachary E Warren; Nilanjan Sarkar
Journal:  Sensors (Basel)       Date:  2021-01-07       Impact factor: 3.576

3.  An Integrated Approach for Cancer Survival Prediction Using Data Mining Techniques.

Authors:  Ishleen Kaur; M N Doja; Tanvir Ahmad; Musheer Ahmad; Amir Hussain; Ahmed Nadeem; Ahmed A Abd El-Latif
Journal:  Comput Intell Neurosci       Date:  2021-12-28

4.  AttendAffectNet-Emotion Prediction of Movie Viewers Using Multimodal Fusion with Self-Attention.

Authors:  Ha Thi Phuong Thao; B T Balamurali; Gemma Roig; Dorien Herremans
Journal:  Sensors (Basel)       Date:  2021-12-14       Impact factor: 3.576

5.  Analysis of different affective state multimodal recognition approaches with missing data-oriented to virtual learning environments.

Authors:  Camilo Salazar; Edwin Montoya-Múnera; Jose Aguilar
Journal:  Heliyon       Date:  2021-06-16

6.  Leveraging Social Computing for Personalized Crisis Communication using Social Media.

Authors:  Dmitry Leykin; Limor Aharonson-Daniel; Mooli Lahad
Journal:  PLoS Curr       Date:  2016-03-24

7.  Role of Muscle Synergies in Real-Time Classification of Upper Limb Motions using Extreme Learning Machines.

Authors:  Chris Wilson Antuvan; Federica Bisio; Francesca Marini; Shih-Cheng Yen; Erik Cambria; Lorenzo Masia
Journal:  J Neuroeng Rehabil       Date:  2016-08-15       Impact factor: 4.262

8.  FusionSense: Emotion Classification Using Feature Fusion of Multimodal Data and Deep Learning in a Brain-Inspired Spiking Neural Network.

Authors:  Clarence Tan; Gerardo Ceballos; Nikola Kasabov; Narayan Puthanmadam Subramaniyam
Journal:  Sensors (Basel)       Date:  2020-09-17       Impact factor: 3.576

