Andrea Bandini1, Silvia Orlandi2, Hugo Jair Escalante3, Fabio Giovannelli4, Massimo Cincotta5, Carlos A Reyes-Garcia6, Paola Vanni7, Gaetano Zaccara8, Claudia Manfredi9. 1. Department of Information Engineering, Università degli Studi di Firenze, Via di S. Marta 3, 50139 Firenze, Italy; Department of Electrical, Electronic and Information Engineering (DEI) "Guglielmo Marconi", Università di Bologna, Viale del Risorgimento 2, 40136, Bologna, Italy. Electronic address: andrea.bandini@uhn.ca. 2. Department of Information Engineering, Università degli Studi di Firenze, Via di S. Marta 3, 50139 Firenze, Italy. Electronic address: silvia.orlandi@unifi.it. 3. Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Computer Science Department, Luis Enrique Erro No. 1, Tonantzintla, Puebla 72840, Mexico. Electronic address: hugojair@inaoep.mx. 4. Unit of Neurology, Florence Health Authority, Ospedale "Nuovo San Giovanni di Dio", Via Torregalli 3, Firenze, Italy. Electronic address: fabio.giovannelli@unifi.it. 5. Unit of Neurology, Florence Health Authority, Ospedale "Nuovo San Giovanni di Dio", Via Torregalli 3, Firenze, Italy. Electronic address: massimo.cincotta@uslcentro.toscana.it. 6. Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Computer Science Department, Luis Enrique Erro No. 1, Tonantzintla, Puebla 72840, Mexico. Electronic address: kargaxxi@inaoep.mx. 7. Unit of Neurology, Florence Health Authority, Ospedale "Nuovo San Giovanni di Dio", Via Torregalli 3, Firenze, Italy. Electronic address: paola.vanni@uslcentro.toscana.it. 8. Unit of Neurology, Florence Health Authority, Ospedale "Nuovo San Giovanni di Dio", Via Torregalli 3, Firenze, Italy. Electronic address: gaetano.zaccara@uslcentro.toscana.it. 9. Department of Information Engineering, Università degli Studi di Firenze, Via di S. Marta 3, 50139 Firenze, Italy. Electronic address: claudia.manfredi@unifi.it.
Abstract
BACKGROUND: The automatic analysis of facial expressions is an evolving field with several clinical applications. One such application is the study of facial bradykinesia in Parkinson's disease (PD), a major motor sign of this neurodegenerative illness. Facial bradykinesia consists of the reduction/loss of facial movements and emotional facial expressions, a condition known as hypomimia. NEW METHOD: In this work we propose an automatic method for studying facial expressions in PD patients relying on video-based analysis. METHODS: 17 parkinsonian patients and 17 healthy control subjects were asked to show basic facial expressions, both upon request of the clinician and after imitation of a visual cue on a screen. Using an existing face tracker, the Euclidean distance of the facial model from a neutral baseline was computed to quantify changes in facial expressivity during the tasks. Moreover, an automatic facial expression recognition algorithm was trained to study how PD expressions differed from standard expressions. RESULTS: Control subjects showed, on average, higher distances than PD patients across the tasks. COMPARISON WITH EXISTING METHODS: This confirms that control subjects display larger movements during both posed and imitated facial expressions. Moreover, our results indicate that anger and disgust are the two most impaired expressions in PD patients. CONCLUSIONS: Contactless video-based systems can be valuable tools for analyzing facial expressions also in rehabilitation, in particular speech therapy, where patients could benefit from real-time feedback about the proper facial expressions/movements to perform.
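The expressivity measure described above (Euclidean distance of the tracked facial model from a neutral baseline) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the 68-point landmark layout, the function name, and the per-frame averaging are assumptions for the example.

```python
import numpy as np

def expressivity_score(landmarks: np.ndarray, baseline: np.ndarray) -> float:
    """Mean Euclidean distance between tracked facial landmarks and a
    neutral-baseline configuration (hypothetical scoring, for illustration).

    landmarks, baseline: arrays of shape (n_points, 2) with (x, y) coords.
    """
    # Per-landmark displacement vectors from the neutral face
    diffs = landmarks - baseline
    # Euclidean norm per landmark, then average over all landmarks
    return float(np.mean(np.linalg.norm(diffs, axis=-1)))

# Toy example: a 68-point neutral face, then every point shifted by (1, 1)
baseline = np.zeros((68, 2))
frame = baseline + 1.0
score = expressivity_score(frame, baseline)  # sqrt(2) per point -> ~1.414
```

In a full pipeline one would compute this score frame by frame over a video and compare its range between patients and controls, which is consistent with the abstract's report that controls show larger distances during the tasks.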