OBJECTIVE: Alzheimer's disease (AD) is a neurodegenerative disorder that initially presents with memory loss in the presence of underlying neurofibrillary tangle and amyloid plaque pathology. Mild cognitive impairment, the initial symptomatic stage, offers an early window for detecting cognitive impairment prior to progressive decline and dementia. We recently developed the Visuospatial Memory Eye-Tracking Test (VisMET), a passive task capable of classifying cognitive impairment in AD in under five minutes. Here we describe the development of a mobile version of VisMET to enable efficient and widespread administration of the task. METHODS: We delivered VisMET on iPad devices and used a transfer learning approach to train a deep neural network to track eye gaze. Eye movements were used to extract memory features and assess cognitive status in a population of 250 individuals. RESULTS: Mild to severe cognitive impairment was identifiable with a test accuracy of 70%. By enforcing a maximum eye-tracking calibration error of 2 cm, we achieved an accuracy of 76%, equivalent to the accuracy obtained using commercial eye-tracking hardware. CONCLUSION: This work demonstrates a mobile version of VisMET capable of estimating the presence of cognitive impairment. SIGNIFICANCE: Given the ubiquity of tablet devices, our approach has the potential to scale globally.
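The calibration-error criterion in the RESULTS section can be sketched as a simple quality filter: sessions whose gaze calibration error exceeds 2 cm are excluded before accuracy is computed. The sketch below is illustrative only, not the authors' code; all field names (`calib_error_cm`, `predicted`, `label`) and the toy data are hypothetical.

```python
# Illustrative sketch of a calibration-error quality filter (hypothetical
# field names; not the VisMET implementation).

def filter_by_calibration(sessions, max_error_cm=2.0):
    """Keep only sessions whose calibration error is within the threshold."""
    return [s for s in sessions if s["calib_error_cm"] <= max_error_cm]

# Toy data: three sessions with per-session calibration error and a
# binary impaired/unimpaired prediction versus the clinical label.
sessions = [
    {"id": 1, "calib_error_cm": 1.2, "predicted": 1, "label": 1},
    {"id": 2, "calib_error_cm": 3.5, "predicted": 0, "label": 1},
    {"id": 3, "calib_error_cm": 0.8, "predicted": 0, "label": 0},
]

kept = filter_by_calibration(sessions)
accuracy = sum(s["predicted"] == s["label"] for s in kept) / len(kept)
print(len(kept), accuracy)  # 2 sessions kept, accuracy 1.0
```

In this toy example, discarding the poorly calibrated session raises accuracy from 2/3 to 2/2, mirroring the abstract's observation that restricting analysis to well-calibrated sessions improved test accuracy from 70% to 76%.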