Erin Smith, Eric A Storch, Ipsit Vahia, Stephen T C Wong, Helen Lavretsky, Jeffrey L Cummings, Harris A Eyre.
Abstract
Affective computing (also referred to as artificial emotion intelligence or emotion AI) is the study and development of systems and devices that can recognize, interpret, process, and simulate emotion or other affective phenomena. With the rapid growth of the aging population around the world, affective computing has immense potential to benefit the treatment and care of late-life mood and cognitive disorders. For late-life depression, affective computing approaches ranging from vocal biomarkers to facial expressions to social media behavioral analysis can be used to address inadequacies of current screening and diagnostic approaches, mitigate loneliness and isolation, provide more personalized treatment, and detect risk of suicide. Similarly, for Alzheimer's disease, eye movement analysis, vocal biomarkers, and driving behavior can provide objective biomarkers for early identification and monitoring, allow a more comprehensive understanding of daily life and disease fluctuations, and facilitate an understanding of behavioral and psychological symptoms such as agitation. To optimize the utility of affective computing while mitigating potential risks, ethical and responsible development of affective computing applications for late-life mood and cognitive disorders is needed.
Keywords: Alzheimer's disease; affective computing; dementia; digital phenotyping; late-life depression
Year: 2021 PMID: 35002802 PMCID: PMC8732874 DOI: 10.3389/fpsyt.2021.782183
Source DB: PubMed Journal: Front Psychiatry ISSN: 1664-0640 Impact factor: 4.157
Affective computing applications for clinical challenges in late-life depression.
| Clinical challenge | Vocal biomarkers | Facial expression analysis | Body and head movement | Eye movement | Keystroke dynamics | Social media behavior | Socially assistive robots (SARs) |
|---|---|---|---|---|---|---|---|
| Inadequacies of current screening and diagnostic approaches | Use vocal biomarkers to detect depression | Use facial expression biomarkers to detect depression | Use head movements and pose to detect depression | Use gaze and eye movement to detect behavior consistent with depression | Use keystroke dynamics to detect typing behavior associated with depression | Use behavioral attributes related to social engagement, emotion, language, linguistic style, and writing aspects to detect depression | Use SARs to administer screening and diagnostic approaches that leverage affective computing biomarkers |
| Trial-and-error treatment approaches | Monitor depression severity and treatment response to determine optimal treatment using vocal biomarkers | Monitor depression severity and treatment response to determine optimal treatment using facial expression biomarkers | Leverage body and head movement analysis to measure depression severity throughout the course of treatment to determine the optimal approach | Measure depression treatment response using anti-saccade eye movement tasks | Determine optimal treatment by measuring depression severity via touchscreen typing | Assess social media behavior and develop a treatment strategy for social media usage that reinforces depressive beliefs and symptoms, to improve overall treatment outcomes | Use SARs to provide an in-home therapeutic approach and collect real-time data on affective computing biomarkers to determine optimal treatment strategies |
| Loneliness and social isolation | Detect loneliness and social isolation and better identify behavioral phenotypes of loneliness and social isolation through vocal biomarkers | Assess spontaneous smile mimicry to detect and monitor loneliness | Assess body movement coordination, which may be impaired during loneliness due to changes in the left posterior superior temporal sulcus | Assess eye movement, which may be impaired during loneliness due to changes in the left posterior superior temporal sulcus | Assess keystroke dynamics and hand action, which may be impaired during loneliness due to changes in the left posterior superior temporal sulcus | Monitor social media behavior for early detection of loneliness | Use SARs or social companion robots with affective computing capabilities to help older adults with depression |
| Poor treatment follow-up | Monitor daily fluctuations using vocal biomarkers during time outside the clinic and receive alerts if problems with treatment arise | Monitor daily fluctuations using facial expression biomarkers during time outside the clinic and receive alerts if problems with treatment arise | Assess body movement during time outside the clinic to better understand symptom fluctuations | Track eye movement to understand depression symptoms between in-clinic visits | Monitor keystroke behavior to assess treatment efficacy | Monitor behavioral attributes related to social engagement, emotion, language, linguistic style, and writing aspects to track depression symptom severity between treatment sessions or clinic visits | Use SARs to interact with older adults and understand depression symptom progression and severity |
| Co-occurrence with anxiety disorders | Differentiate between depression and anxiety disorders using vocal biomarkers | Differentiate between depression and anxiety disorders using facial expression biomarkers | Identify and monitor anxiety and depression severity scores using digital gait analysis | Assess anxiety and depression severity using eye movement | Monitor and differentiate between anxiety and depression via touchscreen typing | Detect anxiety and depression via social media behavior | Capture affective computing biomarkers to differentiate between anxiety and depression via SARs |
| Co-occurrence with Alzheimer's disease | Monitor symptoms associated with depression and Alzheimer's disease using vocal biomarkers | Monitor symptoms associated with depression and Alzheimer's disease using facial expression biomarkers | Use kinematic analysis to detect co-morbid Alzheimer's disease in patients with depression | Detect and differentiate between Alzheimer's disease and depression via eye movement tracking | Monitor early stages of Alzheimer's disease and depression via touchscreen typing | Detect late-life depression in Alzheimer's disease patients via speech and language analysis on social media | Use SARs to analyze speech and language of people with late-life depression and Alzheimer's disease |
| Risk of suicide | Detect suicidal ideation and risk using vocal biomarkers | Evaluate risk of suicide using facial expression biomarkers | Monitor body movement to detect risk of suicide | Assess eye movement to identify attentional bias for suicide-related stimuli | Use digital phenotyping from smartphone typing to detect risk of suicide | Detect suicidal ideation and risk using social media behavioral analysis | For people at risk of suicide, leverage SARs to track affective computing biomarkers that indicate suicide risk |
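The keystroke-dynamics column above refers to timing features extracted from typing, such as dwell and flight times. A minimal sketch of how such features could be computed, assuming raw key press/release timestamps are available; the function name, input format, and pause threshold are illustrative, not taken from the paper:

```python
# Minimal sketch of keystroke-dynamics feature extraction, as referenced in
# the table above. Assumes (key, press_time, release_time) tuples in seconds,
# ordered by press time; all names and thresholds are illustrative.
from statistics import mean, stdev

def keystroke_features(events):
    """Compute simple dwell-time and flight-time summaries.

    events: list of (key, press_time, release_time) tuples.
    Returns a dict of timing features often used as digital biomarkers.
    """
    # Dwell time: how long each key is held down.
    dwells = [release - press for _, press, release in events]
    # Flight time: gap between releasing one key and pressing the next.
    flights = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return {
        "mean_dwell": mean(dwells),
        "sd_dwell": stdev(dwells) if len(dwells) > 1 else 0.0,
        "mean_flight": mean(flights) if flights else 0.0,
        # Fraction of inter-key gaps longer than an arbitrary 0.5 s pause.
        "pause_rate": sum(f > 0.5 for f in flights) / len(flights) if flights else 0.0,
    }

# Example: three keystrokes with a long pause before the last one.
events = [("h", 0.00, 0.08), ("i", 0.15, 0.22), ("!", 1.00, 1.09)]
print(keystroke_features(events))
```

In practice, such timing summaries would be aggregated over many typing sessions and fed to a classifier; the sketch only shows the raw feature step.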
Affective computing applications for clinical challenges in Alzheimer's disease (AD).
| Clinical challenge | Vocal biomarkers | Facial expression analysis | Body movement | Eye movement | Keystroke dynamics |
|---|---|---|---|---|---|
| Lack of early, objective screening and diagnostic approaches | Use vocal biomarkers to detect and assess mild cognitive impairment (MCI) and prodromal stages of AD | Assess cognitive and neuropsychiatric symptoms of AD using facial expression impairments | Capture motor impairments that precede signs of cognitive impairment by over a decade in people with AD by measuring gait speed, stride length, and gait symmetry | Use eye movement to detect MCI and prodromal stages of AD | Capture differences in reaction speed and movement found in early stages of AD using an active finger-tapping test or passive data collection from daily computer, tablet, or smartphone keyboard use |
| Lack of objective biomarkers for monitoring disease progression and daily fluctuations | Use vocal biomarkers to monitor disease progression | Monitor disease progression and daily symptom fluctuations using facial expressions | Assess gait and balance throughout AD to monitor disease progression | Monitor eye movement to track AD progression | Monitor progression of cognitive impairment, including from MCI to AD, using touchscreen typing |
| Understanding and addressing the behavioral and psychological symptoms, such as agitation and pain, experienced by patients with AD | Capture vocal biomarkers using sensing technology to monitor behavioral and psychological symptoms of AD | Digitize facial expressions and movements to monitor behavioral and psychological symptoms of AD using sensing technology | Use body movements, such as the number of transitions between spaces, to detect and better understand different behavioral and psychological symptoms of AD | Use eye movements to understand and monitor behavioral and psychological symptoms of AD via sensing technology | Analyze typing to identify subtypes of AD based on the presence and intensity of behavioral and psychological symptoms |
| Co-occurrence with depression | Monitor symptoms associated with depression and AD using vocal biomarkers | Monitor symptoms associated with depression and AD using facial expression biomarkers | Use kinematic analysis to detect co-morbid AD in patients with depression | Detect and differentiate between AD and depression via eye movement tracking | Monitor early stages of AD and depression via touchscreen typing |
| Misdiagnosis between AD and other neurodegenerative disorders during early stages of disease progression | Differentiate between AD, Parkinson's disease (PD), and Lewy body disease (LBD) using vocal biomarkers | Differentiate between AD and PD using facial expression analysis | Differentiate between AD and PD using digital gait analysis | Assess PD, AD, and Lewy body dementia via eye movement analysis | Detect and differentiate early stages of PD and AD using typing and keyboard dynamics |
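The gait measures named in the table above (gait speed, stride length, gait symmetry) can be derived from per-foot heel-strike timestamps. A minimal sketch under that assumption; the function name, input format, and symmetry index are illustrative, not from the paper:

```python
# Minimal sketch of digital gait metrics (gait speed, stride-time symmetry),
# as referenced in the table above. Assumes heel-strike timestamps per foot
# (seconds) and the walked distance (meters); all names are illustrative.

def gait_metrics(left_strikes, right_strikes, distance_m):
    """left_strikes/right_strikes: heel-strike times for each foot."""
    # Stride time: interval between consecutive heel strikes of the same foot.
    left_strides = [b - a for a, b in zip(left_strikes, left_strikes[1:])]
    right_strides = [b - a for a, b in zip(right_strikes, right_strikes[1:])]
    duration = max(left_strikes[-1], right_strikes[-1]) - min(left_strikes[0], right_strikes[0])
    mean_l = sum(left_strides) / len(left_strides)
    mean_r = sum(right_strides) / len(right_strides)
    return {
        "gait_speed_mps": distance_m / duration,
        # Simple symmetry index: 1.0 means perfectly symmetric stride times.
        "symmetry": min(mean_l, mean_r) / max(mean_l, mean_r),
    }

# Example: a symmetric 5 m walk with alternating heel strikes.
left = [0.0, 1.1, 2.2, 3.3]
right = [0.55, 1.65, 2.75, 3.85]
print(gait_metrics(left, right, 5.0))
```

In a wearable or smartphone setting, heel-strike times would come from accelerometer event detection; the sketch starts after that detection step.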