Katherine C Kellogg, Shiri Sadeh-Sharvit.
Abstract
The integration of artificial intelligence (AI) technologies into mental health holds the promise of increasing patient access, engagement, and quality of care, and of improving clinician quality of work life. However, to date, studies of AI technologies in mental health have focused primarily on challenges that policymakers, clinical leaders, and data and computer scientists face, rather than on challenges that frontline mental health clinicians are likely to face as they attempt to integrate AI-based technologies into their everyday clinical practice. In this Perspective, we describe a framework for "pragmatic AI-augmentation" that addresses these issues by outlining three categories of emerging AI-based mental health technologies which frontline clinicians can leverage in their clinical practice: automation, engagement, and clinical decision support technologies. We elaborate the potential benefits offered by these technologies, the likely day-to-day challenges they may raise for mental health clinicians, and some solutions that clinical leaders and technology developers can use to address these challenges, based on emerging experience with the integration of AI technologies into clinician daily practice in other healthcare disciplines.
Keywords: AI-augmentation; artificial intelligence; automation technologies; clinical practice; decision support technologies; engagement technologies; mental healthcare
Year: 2022 PMID: 36147984 PMCID: PMC9485594 DOI: 10.3389/fpsyt.2022.990370
Source DB: PubMed Journal: Front Psychiatry ISSN: 1664-0640 Impact factor: 5.435
Key AI technologies, example applications in mental healthcare, and potential benefits.
| | Automation technologies | Engagement technologies | Clinical decision support technologies |
|---|---|---|---|
| Key AI technologies | • Computer vision • Machine learning | • Conversational agents | • Machine learning |
| Example applications in MH | • Screening for autism, eating disorders | • Coaching for smoking cessation, exercise, nutrition | • Depression and anxiety prediction |
| Potential benefits | • Increase early screening | • Increase patient access | • Help clinicians identify mental illnesses at an earlier stage, when interventions may be more effective |
| AI-augmentation in MH practice | • Automation technologies could help alleviate some of the severe staffing shortages in mental healthcare by extending, rather than replacing, the knowledge and expertise of human clinicians | • AI engagement tools could complement the therapist's interventions in a blended therapy model; to support skill practice between sessions, for example, the therapist could assign a specific GSH component, extending therapy beyond the meeting | • To increase clinicians' readiness to use AI tools in decision-making, there must be more openness about how algorithms are developed, the data used to create them, and the involvement of mental health practitioners and service users in their evaluation and improvement |
AI, artificial intelligence; MH, mental health; GSH, guided self-help.
Challenges and potential solutions related to pragmatic AI-augmentation.
| | Automation technologies | Engagement technologies | Clinical decision support technologies |
|---|---|---|---|
| Challenges | | | |
| • Work practices | • Automation complacency | • Required workflow changes | • Lack of interpretability of AI recommendations |
| • Beliefs and identity | • Fear of obsolescence | • Fear of negative impact on clinician-patient therapeutic alliance | • Concerns about model accuracy and bias |
| • Tasks and roles | • Clinician-AI task allocation | • Concerns about patient overreliance on technology | • Clinician surveillance concerns |
| Potential solutions | | | |
| • Work practices | • Provide proactive risk assessment | • Engage clinicians in an iterative technology development and implementation process | • Train clinicians in computational thinking |
| • Beliefs and identity | • Provide more accurate portrayals of the capabilities of AI technologies | • Discuss potential risks related to the patient becoming too emotionally attached | • Use human-in-the-loop development |
| • Tasks and roles | • Add organizational structures and roles to govern AI projects | • Automate detection of threat and provide recommended action | • Configure AI technologies so that predictions do not infringe upon clinicians' core tasks |
AI, artificial intelligence.