
A real-time automated system for the recognition of human facial expressions.

Keith Anderson; Peter W McOwan.

Abstract

A fully automated, multistage system for real-time recognition of facial expression is presented. The system uses facial motion to characterize monochrome frontal views of facial expressions and is able to operate effectively in cluttered and dynamic scenes, recognizing the six emotions universally associated with unique facial expressions, namely happiness, sadness, disgust, surprise, fear, and anger. Faces are located using a spatial ratio template tracker algorithm. Optical flow of the face is subsequently determined using a real-time implementation of a robust gradient model. The expression recognition system then averages facial velocity information over identified regions of the face and cancels out rigid head motion by taking ratios of this averaged motion. The motion signatures produced are then classified using Support Vector Machines as either nonexpressive or as one of the six basic emotions. The completed system is demonstrated in two simple affective computing applications that respond in real time to the facial expressions of the user, thereby providing the potential for improvements in the interaction between a computer user and technology.
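The motion-signature stage described above (average optical flow over face regions, then take ratios so that motion common to the whole head cancels) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the region layout, the reference region, and the synthetic flow field are all hypothetical, and the downstream SVM classification step is omitted.

```python
# Hypothetical sketch of the region-averaging / motion-ratio stage.
# A flow field is a 2-D grid of (vx, vy) velocity tuples; regions are
# rectangles (y0, y1, x0, x1) standing in for tracked facial areas.

def average_flow(flow, region):
    """Mean (vx, vy) velocity over one rectangular face region."""
    y0, y1, x0, x1 = region
    cells = [flow[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    n = len(cells)
    return (sum(v[0] for v in cells) / n, sum(v[1] for v in cells) / n)

def motion_signature(flow, regions):
    """Ratios of region-averaged vertical motion against a reference
    region (regions[0]); motion shared by every region divides out,
    which is the sense in which ratios suppress rigid head motion."""
    avgs = [average_flow(flow, r) for r in regions]
    ref_vy = avgs[0][1] or 1e-9  # guard against a zero reference
    return [vy / ref_vy for (_, vy) in avgs[1:]]

# Synthetic 4x4 flow field: a reference patch moving at vy=1.0, a
# "brow" patch at vy=2.0, and a "mouth" patch at vy=3.0.
flow = ([[(0.0, 1.0)] * 2 + [(0.0, 2.0)] * 2 for _ in range(2)]
        + [[(0.0, 3.0)] * 4 for _ in range(2)])
regions = [(0, 2, 0, 2), (0, 2, 2, 4), (2, 4, 0, 4)]
signature = motion_signature(flow, regions)  # → [2.0, 3.0]
```

In the full system this signature vector, rather than raw flow, would be what a Support Vector Machine classifies as nonexpressive or as one of the six basic emotions.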

Year:  2006        PMID: 16468569     DOI: 10.1109/tsmcb.2005.854502

Source DB:  PubMed          Journal:  IEEE Trans Syst Man Cybern B Cybern        ISSN: 1083-4419


Related articles: 9 in total

1.  ReliefF-Based EEG Sensor Selection Methods for Emotion Recognition.

Authors:  Jianhai Zhang; Ming Chen; Shaokai Zhao; Sanqing Hu; Zhiguo Shi; Yu Cao
Journal:  Sensors (Basel)       Date:  2016-09-22       Impact factor: 3.576

2.  Subject-independent emotion recognition based on physiological signals: a three-stage decision method.

Authors:  Jing Chen; Bin Hu; Yue Wang; Philip Moore; Yongqiang Dai; Lei Feng; Zhijie Ding
Journal:  BMC Med Inform Decis Mak       Date:  2017-12-20       Impact factor: 2.796

3.  Emotion recognition based on EEG features in movie clips with channel selection.

Authors:  Mehmet Siraç Özerdem; Hasan Polat
Journal:  Brain Inform       Date:  2017-07-15

4.  Identifying fetal yawns based on temporal dynamics of mouth openings: A preterm neonate model using support vector machines (SVMs).

Authors:  Damiano Menin; Angela Costabile; Flaviana Tenuta; Harriet Oster; Marco Dondi
Journal:  PLoS One       Date:  2019-12-19       Impact factor: 3.240

5.  AttendAffectNet-Emotion Prediction of Movie Viewers Using Multimodal Fusion with Self-Attention.

Authors:  Ha Thi Phuong Thao; B T Balamurali; Gemma Roig; Dorien Herremans
Journal:  Sensors (Basel)       Date:  2021-12-14       Impact factor: 3.576

6.  Emotion Recognition Based on EEG Using Generative Adversarial Nets and Convolutional Neural Network.

Authors:  Bo Pan; Wei Zheng
Journal:  Comput Math Methods Med       Date:  2021-10-11       Impact factor: 2.238

7.  [Review] The Application of Electroencephalogram in Driving Safety: Current Status and Future Prospects.

Authors:  Yong Peng; Qian Xu; Shuxiang Lin; Xinghua Wang; Guoliang Xiang; Shufang Huang; Honghao Zhang; Chaojie Fan
Journal:  Front Psychol       Date:  2022-07-22

8.  EEG Emotion Classification Network Based on Attention Fusion of Multi-Channel Band Features.

Authors:  Xiaoliang Zhu; Wenting Rong; Liang Zhao; Zili He; Qiaolai Yang; Junyi Sun; Gendong Liu
Journal:  Sensors (Basel)       Date:  2022-07-13       Impact factor: 3.847

9.  Global Electroencephalography Synchronization as a New Indicator for Tracking Emotional Changes of a Group of Individuals during Video Watching.

Authors:  Chang-Hee Han; Jun-Hak Lee; Jeong-Hwan Lim; Yong-Wook Kim; Chang-Hwan Im
Journal:  Front Hum Neurosci       Date:  2017-12-01       Impact factor: 3.169

