
The spectrum of facial palsy: The MEEI facial palsy photo and video standard set.

Jacqueline J Greene, Diego L Guarin, Joana Tavares, Emily Fortier, Mara Robinson, Joseph Dusseldorp, Olivia Quatela, Nate Jowett, Tessa Hadlock

Abstract

OBJECTIVES: Facial palsy causes variable facial disfigurement ranging from subtle asymmetry to crippling deformity. There is no existing standard database to serve as a resource for facial palsy education and research. We present a standardized set of facial photographs and videos representing the entire spectrum of flaccid and nonflaccid (aberrantly regenerated or synkinetic) facial palsy. To demonstrate the utility of the dataset, we describe the relationship between level of facial function and perceived emotion expression as determined by an automated emotion detection, machine learning-based algorithm.
METHODS: Photographs and videos of patients with both flaccid and nonflaccid facial palsy were prospectively gathered. The degree of facial palsy was quantified using eFACE, House-Brackmann, and Sunnybrook scales. Perceived emotion during a standard video of facial movements was determined using an automated, machine learning algorithm.
RESULTS: Sixty participants were enrolled and categorized by eFACE score across the range of facial function. Patients with complete flaccid facial palsy (eFACE <60) had a significant loss of perceived joy compared to the nonflaccid and normal groups. Additionally, patients with only moderate flaccid and nonflaccid facial palsy had a significant increase in perceived negative emotion (contempt) when compared to the normal group.
CONCLUSION: We provide this open-source database to assist in comparing current and future scales of facial function as well as facilitate comprehensive investigation of the entire spectrum of facial palsy. The automated machine learning-based algorithm detected negative emotions at moderate levels of facial palsy and suggested a threshold severity of flaccid facial palsy beyond which joy was not perceived.
LEVEL OF EVIDENCE: NA. Laryngoscope, 130:32-37, 2020.
© 2019 The American Laryngological, Rhinological and Otological Society, Inc.


Keywords:  Facial palsy; affectiva; emotion; facial expression; facial paralysis; joy perception; machine learning; nonflaccid facial palsy; standard set

Year:  2019        PMID: 31021433     DOI: 10.1002/lary.27986

Source DB:  PubMed          Journal:  Laryngoscope        ISSN: 0023-852X            Impact factor:   3.325


Related articles: 3 in total

1.  Toward an Automatic System for Computer-Aided Assessment in Facial Palsy.

Authors:  Diego L Guarin; Yana Yunusova; Babak Taati; Joseph R Dusseldorp; Suresh Mohan; Joana Tavares; Martinus M van Veen; Emily Fortier; Tessa A Hadlock; Nate Jowett
Journal:  Facial Plast Surg Aesthet Med       Date:  2020 Jan/Feb

2.  Towards Facial Gesture Recognition in Photographs of Patients with Facial Palsy.

Authors:  Gemma S Parra-Dominguez; Raul E Sanchez-Yanez; Carlos H Garcia-Capulin
Journal:  Healthcare (Basel)       Date:  2022-03-31

3.  A New Dataset for Facial Motion Analysis in Individuals With Neurological Disorders.

Authors:  Andrea Bandini; Sia Rezaei; Diego L Guarin; Madhura Kulkarni; Derrick Lim; Mark I Boulos; Lorne Zinman; Yana Yunusova; Babak Taati
Journal:  IEEE J Biomed Health Inform       Date:  2021-04-06       Impact factor: 5.772


北京卡尤迪生物科技股份有限公司 (Beijing Coyote Bioscience Co., Ltd.) © 2022-2023.