Emma M Parrish1, Tess F Filip2, John Torous3, Camille Nebeker4,5,6, Raeanne C Moore2,4, Colin A Depp2,7. 1. San Diego Joint Doctoral Program in Clinical Psychology, San Diego State University/University of California, San Diego, CA, USA. 2. Department of Psychiatry, University of California San Diego (UCSD), CA, USA. 3. Department of Psychiatry, Beth Israel Deaconess Medical Center, Harvard Medical School, Boston, MA, USA. 4. Center for Wireless and Population Health Systems, University of California San Diego, La Jolla, CA, USA. 5. Department of Family Medicine and Public Health, School of Medicine, University of California San Diego, La Jolla, CA, USA. 6. Research Center for Optimal Digital Ethics in Health (ReCODE.Health), Qualcomm Institute and School of Medicine, University of California San Diego, La Jolla, CA, USA. 7. VA San Diego Healthcare System, San Diego, CA, USA.
Abstract
Background: Mental health (MH) apps are growing in popularity. While MH apps may be helpful, less is known about how crises such as suicidal ideation are addressed in apps. Aims: We examined the proportion of MH apps that contained language mentioning suicide or suicidal ideation and how apps communicated these policies and directed users to MH resources through app content, terms of service, and privacy policies. Method: We chose apps using an Internet search of "top mental health apps," similar to how a user might find an app, and extracted information about how crisis language was presented in these apps. Results: We found that crisis language was inconsistent among apps. Overall, 35% of apps provided crisis-specific resources in their app interface, and 10.5% contained crisis language in their terms of service or privacy policies. Limitations: This study employed a nonsystematic approach to sampling apps, and therefore the findings may not broadly represent apps for MH. Conclusion: To address the inconsistency of crisis resources, crisis language should be included as part of app evaluation frameworks, and internationally accessible, vetted resources should be provided to app users.
Keywords: digital health ethics; mental health treatment; mobile health; self-help; suicidal ideation; telehealth