| Literature DB >> 32515552 |
Kwang Hyeon Kim, Kyeongyun Park, Haksoo Kim, Byungdu Jo, Sang Hee Ahn, Chankyu Kim, Myeongsoo Kim, Tae Ho Kim, Se Byeong Lee, Dongho Shin, Young Kyung Lim, Jong Hwi Jeong.
Abstract
PURPOSE: Imaging, breath-holding/gating, and fixation devices have been developed to minimize setup errors so that the prescribed dose can be delivered precisely to the target volume in radiotherapy. Despite these efforts, additional patient monitoring devices have been installed in treatment rooms to observe patients' whole-body movement. We developed a facial expression recognition system using deep learning with a convolutional neural network (CNN) to predict patients' movement in advance, enhancing the stability of radiation treatment by issuing warning signs to radiation therapists.
Keywords: Radiotherapy; convolutional neural network; facial expression recognition; patient monitoring system; predicting body movement
Year: 2020 PMID: 32515552 PMCID: PMC7484824 DOI: 10.1002/acm2.12945
Source DB: PubMed Journal: J Appl Clin Med Phys ISSN: 1526-9914 Impact factor: 2.102
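The abstract describes classifying a patient's face image into one of eight emotion categories with a CNN. A minimal, illustrative forward pass is sketched below; this is not the authors' architecture — the layer shapes, 48×48 input, and random weights are assumptions for illustration, with only the eight-class emotion head taken from the paper's emotion set.

```python
import numpy as np

EMOTIONS = ["neutral", "angry", "contemptuous", "disgusted",
            "frightened", "happy", "sad", "surprised"]

def conv2d(img, kernel):
    """Valid 2-D convolution (single channel, stride 1)."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def classify(img, kernel, w, b):
    """Conv layer -> ReLU -> global average pool -> linear head -> softmax."""
    feat = np.maximum(conv2d(img, kernel), 0.0)  # ReLU nonlinearity
    pooled = feat.mean()                         # global average pool (scalar)
    logits = w * pooled + b                      # linear head, 8 emotion classes
    return softmax(logits)

# Dummy data: a random grayscale face crop and random (untrained) weights.
rng = np.random.default_rng(0)
img = rng.random((48, 48))
kernel = rng.standard_normal((3, 3))
w = rng.standard_normal(8)
b = np.zeros(8)

probs = classify(img, kernel, w, b)
dominant = EMOTIONS[int(probs.argmax())]
```

Per the convention in Fig. 5, an emotion would be flagged as dominant when its score in `probs` exceeds 50% for a frame.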
Face recognition research using facial images.
| Authors | Goal | Result | Algorithm | Facial database | Year |
|---|---|---|---|---|---|
| Shakya et al. | Human behavior prediction | Emotion analysis and its appropriate behavior prediction | Machine vision (PCA) | Extended Cohn‐Kanade dataset (CK+) | 2016 |
| Benitez‐Quiroz et al. | Automatic annotation of facial expressions in the wild | Emotional annotation for a million facial expression images | Machine vision (Shape feature) | EmotioNet | 2016 |
| Pantic et al. | Web‐based database for facial expression analysis | Automatic analysis of facial expressions | Facial action coding system (FACS) | MMI facial expression dataset | 2005 |
| Ekman | Research for emotion in the human face | Face‐emotion connection and its judgment and analysis for psychology, anthropology, ethology, sociology, and biology | N/A | Sampling persons | 1972 |
Fig. 1. A facial expression monitoring system for patients in a radiation treatment room.
Fig. 2. System implementation and user interface.
Fig. 3. The developed convolutional neural network architecture.
Fig. 4. Implemented system algorithm using the convolutional neural network, and the entire process for the whole treatment.
Fig. 5. Sample facial images from 447 images (©Jeffrey Cohn). *Notation: Red = dominant emotion with a facial expression score over 50% for each frame.
Facial expression composition.
| | Neutral | Angry | Contemptuous | Disgusted | Frightened | Happy | Sad | Surprised | Total |
|---|---|---|---|---|---|---|---|---|---|
| Count | 121 | 45 | 18 | 59 | 25 | 69 | 28 | 82 | 447 |
| Share | 27.07% | 10.07% | 4.03% | 13.20% | 5.59% | 15.44% | 6.26% | 18.34% | 100% |
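The percentage row follows directly from the per-emotion counts above; a quick check:

```python
# Per-emotion image counts from the composition table.
counts = {"Neutral": 121, "Angry": 45, "Contemptuous": 18, "Disgusted": 59,
          "Frightened": 25, "Happy": 69, "Sad": 28, "Surprised": 82}

total = sum(counts.values())  # 447 images in total
shares = {k: round(100 * v / total, 2) for k, v in counts.items()}
# e.g. shares["Neutral"] -> 27.07, shares["Surprised"] -> 18.34
```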
Fig. 6. Facial expression monitoring result by frame sequence (n = 10).
Facial recognition results.
| n | Detection | Dominant emotion (%) | Stability (%) | Expected nonmotion (%) | Expected motion (%) | Real motion result | Motion prediction accuracy (%) |
|---|---|---|---|---|---|---|---|
| 1 | Y | Sad (53) | 88.6 | 1.0 | 99.0 | N | 1.0 |
| 2 | Y | Surprised (67.2) | 48.9 | 0.2 | 99.8 | N | 0.2 |
| 3 | Y | Happy (54.1) | 84.8 | 92.7 | 7.3 | N | 92.7 |
| 4 | Y | Disgusted (77.4) | 29.2 | 5.8 | 94.2 | N | 5.8 |
| 5 | Y | Sad (63.6) | 57.3 | 5.4 | 94.6 | N | 5.4 |
| 6 | Y | Disgusted (86.1) | 16.1 | 11.1 | 89.0 | N | 11.1 |
| 7 | Y | Surprised (55.3) | 80.8 | 27.7 | 72.3 | N | 27.7 |
| 8 | Y | Disgusted (72.6) | 37.8 | 2.8 | 97.2 | N | 2.8 |
| 9 | Y | Surprised (72.6) | 37.8 | 24.6 | 75.4 | N | 24.6 |
| 10 | Y | Neutral (55.7) | 79.7 | 75.5 | 24.5 | N | 75.5 |
Notation: Stability = Dominant emotion/(1 − dominant emotion).
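The two derived quantities in the results table can be sketched as follows. The stability ratio implements the notation line literally (dominant emotion score p gives p/(1 − p), an odds-style ratio; the table reports stability as a percentage, so an additional scaling not specified here may apply). The accuracy rule is inferred from the table rows: every real result was "N" (no motion), and the reported motion prediction accuracy equals the expected-nonmotion column, i.e. the probability assigned to the observed outcome.

```python
def stability(p):
    """Odds-style stability ratio per the notation: p / (1 - p),
    where p is the dominant-emotion score in the range (0, 1)."""
    return p / (1.0 - p)

def prediction_accuracy(expected_nonmotion, expected_motion, moved):
    """Accuracy credited to the probability assigned to the observed outcome
    (inferred from the table: real result 'N' -> expected-nonmotion value)."""
    return expected_motion if moved else expected_nonmotion

# Row n = 3: Happy (54.1%), expected nonmotion 92.7%, real result N.
acc = prediction_accuracy(92.7, 7.3, moved=False)  # 92.7, matching the table
odds = stability(0.541)
```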
Fig. 7. Training and validation accuracy, and receiver operating characteristic curves for facial emotions.