
Human-Guided Modality Informativeness for Affective States.

Torsten Wörtwein, Lisa B. Sheeber, Nicholas Allen, Jeffrey F. Cohn, Louis-Philippe Morency

Abstract

This paper studies the hypothesis that not all modalities are always needed to predict affective states. We explore this hypothesis in the context of recognizing three affective states that have been linked to a future onset of depression: positive, aggressive, and dysphoric. In particular, we investigate three modalities important for face-to-face conversations: the visual, language, and acoustic modalities. We first perform a human study to better understand which subsets of modalities people find informative when recognizing these three affective states. As a second contribution, we explore how these human annotations can guide automatic affect recognition systems to become more interpretable without degrading their predictive performance. Our studies show that humans can reliably annotate modality informativeness. Furthermore, we observe that guided models significantly improve interpretability, i.e., they attend to modalities similarly to how humans rate modality informativeness, while at the same time showing a slight increase in predictive performance.


Keywords:  affective computing; fusion; multimodal

Year:  2021        PMID: 35128550      PMCID: PMC8812829          DOI: 10.1145/3462244.3481004

Source DB:  PubMed          Journal:  Proc ACM Int Conf Multimodal Interact


References:  11 in total

1.  The discrimination of speech sounds within and across phoneme boundaries.

Authors:  A. M. Liberman; K. S. Harris; H. S. Hoffman; B. C. Griffith
Journal:  J Exp Psychol       Date:  1957-11

2.  AFAR: A Deep Learning Based Tool for Automated Facial Affect Recognition.

Authors:  Itir Onal Ertugrul; László A Jeni; Wanqiao Ding; Jeffrey F Cohn
Journal:  Proc Int Conf Autom Face Gesture Recognit       Date:  2019-07-11

3.  Adaptive Mixtures of Local Experts.

Authors:  Robert A Jacobs; Michael I Jordan; Steven J Nowlan; Geoffrey E Hinton
Journal:  Neural Comput       Date:  1991       Impact factor: 2.026

4.  Multimodal Routing: Improving Local and Global Interpretability of Multimodal Language Analysis.

Authors:  Yao-Hung Hubert Tsai; Martin Q Ma; Muqiao Yang; Ruslan Salakhutdinov; Louis-Philippe Morency
Journal:  Proc Conf Empir Methods Nat Lang Process       Date:  2020-11

5.  Multimodal Transformer for Unaligned Multimodal Language Sequences.

Authors:  Yao-Hung Hubert Tsai; Shaojie Bai; Paul Pu Liang; J Zico Kolter; Louis-Philippe Morency; Ruslan Salakhutdinov
Journal:  Proc Conf Assoc Comput Linguist Meet       Date:  2019-07

6.  Nonverbal Social Withdrawal in Depression: Evidence from manual and automatic analysis.

Authors:  Jeffrey M Girard; Jeffrey F Cohn; Mohammad H Mahoor; S Mohammad Mavadati; Zakia Hammal; Dean P Rosenwald
Journal:  Image Vis Comput       Date:  2014-10       Impact factor: 2.818

7.  Systematic review of school-based prevention and early intervention programs for depression.

Authors:  Alison L Calear; Helen Christensen
Journal:  J Adolesc       Date:  2009-07-31

8.  FERA 2017 - Addressing Head Pose in the Third Facial Expression Recognition and Analysis Challenge.

Authors:  Michel F Valstar; Enrique Sánchez-Lozano; Jeffrey F Cohn; László A Jeni; Jeffrey M Girard; Zheng Zhang; Lijun Yin; Maja Pantic
Journal:  Proc Int Conf Autom Face Gesture Recognit       Date:  2017-06-29

9.  Global, regional, and national incidence, prevalence, and years lived with disability for 354 diseases and injuries for 195 countries and territories, 1990-2017: a systematic analysis for the Global Burden of Disease Study 2017.

Authors: 
Journal:  Lancet       Date:  2018-11-08       Impact factor: 79.321

10.  Psychobiological markers of allostatic load in depressed and nondepressed mothers and their adolescent offspring.

Authors:  Benjamin W Nelson; Lisa Sheeber; Jennifer Pfeifer; Nicholas B Allen
Journal:  J Child Psychol Psychiatry       Date:  2020-05-21       Impact factor: 8.982

