| Literature DB >> 29389845 |
Dhwani Mehta, Mohammad Faridul Haque Siddiqui, Ahmad Y. Javaid.
Abstract
The broad range of possible applications has made emotion recognition both unavoidable and challenging in the field of computer science. Non-verbal cues such as gestures, body movement, and facial expressions convey the user's feelings and feedback. This discipline of Human-Computer Interaction relies on algorithmic robustness and sensor sensitivity to improve recognition. Sensors play a significant role in accurate detection by providing very high-quality input, thereby increasing the efficiency and reliability of the system. Automatic recognition of human emotions would help teach social intelligence to machines. This paper presents a brief study of the various approaches and techniques of emotion recognition. The survey includes a succinct review of the databases used as data sets for algorithms that detect emotions from facial expressions. The mixed-reality device Microsoft HoloLens (MHL) is then introduced for observing emotion recognition in Augmented Reality (AR), with a brief introduction to its sensors, their application in emotion recognition, and some preliminary results of emotion recognition using the MHL. The paper concludes by comparing the emotion recognition results of the MHL and a regular webcam.
Keywords: Microsoft HoloLens; affect; augmented reality; emotion recognition; facial expressions; human–computer interaction; intelligence; sensors
Year: 2018 PMID: 29389845 PMCID: PMC5856132 DOI: 10.3390/s18020416
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Face detection and emotion recognition using machine learning.
Figure 2. Face detection and emotion recognition using a geometric feature-based process.
Figure 3. Popular mixed-reality device (MRD): Microsoft HoloLens (MHL).
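The geometric feature-based process of Figure 2 derives features from distances between facial landmarks. The toy sketch below illustrates the idea; the landmark index layout (eyes, mouth corners, nose tip, inner brow) is a hypothetical convention for this example, not a standard landmark scheme from the paper.

```python
from math import hypot

def geometric_features(landmarks):
    # landmarks: list of (x, y) tuples; the index layout below
    # (0-1 eyes, 2-3 mouth corners, 4 nose tip, 5 inner brow) is
    # a hypothetical convention for this sketch.
    (lx, ly), (rx, ry) = landmarks[0], landmarks[1]
    (mlx, mly), (mrx, mry) = landmarks[2], landmarks[3]
    (nx, ny), (bx, by) = landmarks[4], landmarks[5]

    # Dividing by the inter-ocular distance makes the features
    # invariant to face scale.
    iod = hypot(rx - lx, ry - ly)
    mouth_width = hypot(mrx - mlx, mry - mly) / iod
    brow_to_nose = hypot(bx - nx, by - ny) / iod
    return mouth_width, brow_to_nose
```

A classifier (e.g., an SVM, as in several table entries below) would then be trained on such feature vectors rather than on raw pixels.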
Comparison of the accuracies achieved through various techniques.
| Author Name | Technique Used | Database Used | Recognition Accuracy | Emotions Considered | Drawbacks |
|---|---|---|---|---|---|
| Matsugu [ | Convolutional neural network (CNN) | Still images own | 97.6%—CNN | Happy, Neutral and | System insensitive to |
| Zhang [ | Patched based 3D Gabor | JAFFE C-K | 92.93%—JAFFE | Happy, Neutral, Sadness, | JAFFE DB requires |
| Hayat [ | SVM with clustering | BU 4DFE | 94.34% | Anger, Disgust, | —— |
| Hablani [ | Local binary patterns | JAFFE | Person dependent— | Happy, Neutral, Sadness, | Manual detection of |
| Zisheng [ | PHOG (Pyramid | C-K | 96.33% | Happy, Neutral, Sadness, | —— |
| Lee [ | Sparse Representation | JAFFE BU 3DFE | Person dependent - | Happiness, Disgust, | The face images used |
| Zheng [ | Group sparse reduced-rank | BU 3DFE | 66.0% | Happiness, Fear, | Implementation of new |
| Yu [ | Deep CNN, 7 hidden layers | SFEW | 61.29% | Happiness, Disgust, | Less accuracy through |
| Dornaika [ | PCA + LDA | CMU | Above 90% | Happy, Neutral | No non-linear dimensionality |
| Meguid [ | Random forest classifiers | AFEW JAFFE-CK | 44.53%—AFEW | Happiness, Disgust, | Assumes that the progression |
| Zhang [ | SVR based AU intensity | C-K | 90.38% | Happiness, Angry, | —— |
| Zhang [ | NN based Facial emotion | C-K | 75.83% | Happiness, Disgust, | Weak affect indicator embedded |
| Wu [ | Gabor motion energy filters | C-K | 78.6% | Happiness, Angry, | Low accuracy |
| Jain [ | Latent-Dynamic Conditional | C-K | 85.84% | Happiness, Disgust, | —— |
| Shan [ | Local Binary Patterns, | C-K | 89.14% | Happiness, Disgust, | Recognition performed using |
| Li [ | PCA, LDA and SVM | 29 Subjects | 3D Database—Above 90% | Happiness, Sadness, | Small sized-database used |
| Mohammed and | Patched geodesic texture | JAFFE BU—3DFE | Angry—90%—JAFFE | Happiness, Disgust, | Consideration of few emotions |
| Rivera [ | local directional | 29 Subjects | 92.9% | Happiness, Sadness, | Very small number of |
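Several entries in the table above (e.g., Hablani and Shan) rely on Local Binary Patterns (LBP). The following minimal sketch computes a basic 8-neighbour LBP histogram over a grayscale image; it omits refinements such as uniform-pattern mapping and block-wise partitioning that the cited works typically use.

```python
def lbp_histogram(image):
    """8-neighbour Local Binary Pattern histogram of a grayscale image.

    `image` is a list of lists of intensities. Each interior pixel is
    encoded as an 8-bit code: bit i is set when neighbour i is at least
    as bright as the centre pixel.
    """
    h, w = len(image), len(image[0])
    # Clockwise offsets of the 8 neighbours, starting at the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    hist = [0] * 256
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            center = image[y][x]
            code = 0
            for bit, (dy, dx) in enumerate(offsets):
                if image[y + dy][x + dx] >= center:
                    code |= 1 << bit
            hist[code] += 1
    return hist
```

The resulting 256-bin histogram (or a concatenation of histograms over face regions) serves as the feature vector fed to a classifier.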
Comparison of emotion recognition using the HoloLens and a webcam.
[Table of 30 image panels, not reproduced here: alternating columns show emotion recognition screenshots captured with the webcam and with the HoloLens.]
Figure 4. Comparison of average emotion recognition accuracy for the five emotions for Subject 1 using the MHL and webcam.
Figure 5. Comparison of average emotion recognition accuracy for the five emotions for Subject 2 using the MHL and webcam.
Figure 6. Comparison of average emotion recognition accuracy for the five emotions for Subject 3 using the MHL and webcam.
Figure 7. Confusion matrix of webcam-based emotion recognition results for the complete dataset.
Figure 8. Confusion matrix of MHL-based emotion recognition results for the complete dataset.
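The confusion matrices in Figures 7 and 8 tabulate, for each true emotion, how often each emotion was predicted. A minimal sketch of the computation is shown below; the five-label set is an assumption for illustration, not the paper's exact label list.

```python
# Assumed label set for illustration; the paper evaluates five emotions,
# but their exact names are not given in this record.
EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def confusion_matrix(true_labels, predicted_labels, labels=EMOTIONS):
    """Rows index the true emotion, columns the predicted emotion."""
    index = {label: i for i, label in enumerate(labels)}
    matrix = [[0] * len(labels) for _ in labels]
    for t, p in zip(true_labels, predicted_labels):
        matrix[index[t]][index[p]] += 1
    return matrix

def per_class_accuracy(matrix):
    # Diagonal count divided by the row total for each true class.
    return [row[i] / sum(row) if sum(row) else 0.0
            for i, row in enumerate(matrix)]
```

Per-class accuracies derived this way correspond to the diagonal entries of a row-normalized confusion matrix, which is how such figures are commonly read.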