
Artificial Intelligence in mental health and the biases of language based models.

Isabel Straw, Chris Callison-Burch.

Abstract

BACKGROUND: The rapid integration of Artificial Intelligence (AI) into the healthcare field has occurred with little communication between computer scientists and doctors. The impact of AI on health outcomes and inequalities calls for health professionals and data scientists to make a collaborative effort to ensure historic health disparities are not encoded into the future. We present a study that evaluates bias in existing Natural Language Processing (NLP) models used in psychiatry and discuss how these biases may widen health inequalities. Our approach systematically evaluates each stage of model development to explore how biases arise from a clinical, data science and linguistic perspective.
DESIGN/METHODS: A literature review of the uses of NLP in mental health was carried out across multiple disciplinary databases with defined MeSH terms and keywords. Our primary analysis evaluated biases within 'GloVe' and 'Word2Vec' word embeddings. Euclidean distances were measured to assess relationships between psychiatric terms and demographic labels, and vector similarity functions were used to solve analogy questions relating to mental health.
RESULTS: Our primary analysis of mental health terminology in GloVe and Word2Vec embeddings demonstrated significant biases with respect to religion, race, gender, nationality, sexuality and age. Our literature review returned 52 papers, of which none addressed all the areas of possible bias that we identify in model development. In addition, only one article existed on more than one research database, demonstrating the isolation of research within disciplinary silos and inhibiting cross-disciplinary collaboration or communication.
CONCLUSION: Our findings are relevant to professionals who wish to minimize the health inequalities that may arise as a result of AI and data-driven algorithms. We offer primary research identifying biases within these technologies and provide recommendations for avoiding these harms in the future.
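The embedding probes described in the methods can be sketched in a few lines. The vectors below are illustrative toy values, not the paper's actual GloVe/Word2Vec embeddings, and the word list is a hypothetical stand-in; the two functions show the general technique only: Euclidean distance between a psychiatric term and demographic labels, and analogy solving via vector offset plus cosine similarity.

```python
import numpy as np

# Toy 4-dimensional vectors standing in for real GloVe/Word2Vec embeddings.
# All values are made up for illustration, not taken from the paper's models.
emb = {
    "depression": np.array([0.9, 0.1, 0.3, 0.2]),
    "anxiety":    np.array([0.8, 0.2, 0.4, 0.1]),
    "woman":      np.array([0.3, 0.9, 0.1, 0.3]),
    "man":        np.array([0.1, 0.8, 0.2, 0.4]),
    "hysterical": np.array([0.6, 0.7, 0.2, 0.2]),
}

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return float(np.linalg.norm(emb[a] - emb[b]))

def analogy(a, b, c):
    """Solve 'a is to b as c is to ?' via vector offset + cosine similarity."""
    target = emb[b] - emb[a] + emb[c]
    best, best_sim = None, -np.inf
    for word, vec in emb.items():
        if word in (a, b, c):
            continue  # exclude the query words themselves
        sim = float(target @ vec / (np.linalg.norm(target) * np.linalg.norm(vec)))
        if sim > best_sim:
            best, best_sim = word, sim
    return best

# An asymmetry in distance between a psychiatric term and two demographic
# labels is the kind of pattern the paper reports as bias.
print(euclidean("depression", "woman"), euclidean("depression", "man"))
print(analogy("man", "woman", "anxiety"))
```

With real pretrained embeddings the same two functions would be run over clinically relevant vocabulary and demographic labels for religion, race, gender, nationality, sexuality and age.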

Year:  2020        PMID: 33332380     DOI: 10.1371/journal.pone.0240376

Source DB:  PubMed          Journal:  PLoS One        ISSN: 1932-6203            Impact factor:   3.240


  6 in total

Review 1.  Evaluation and Mitigation of Racial Bias in Clinical Machine Learning Models: Scoping Review.

Authors:  Jonathan Huang; Galal Galal; Mozziyar Etemadi; Mahesh Vaidyanathan
Journal:  JMIR Med Inform       Date:  2022-05-31

Review 2.  Investigating for bias in healthcare algorithms: a sex-stratified analysis of supervised machine learning models in liver disease prediction.

Authors:  Isabel Straw; Honghan Wu
Journal:  BMJ Health Care Inform       Date:  2022-04

Review 3.  Artificial intelligence: A rapid case for advancement in the personalization of Gynaecology/Obstetric and Mental Health care.

Authors:  Gayathri Delanerolle; Xuzhi Yang; Suchith Shetty; Vanessa Raymont; Ashish Shetty; Peter Phiri; Dharani K Hapangama; Nicola Tempest; Kingshuk Majumder; Jian Qing Shi
Journal:  Womens Health (Lond)       Date:  2021 Jan-Dec

4.  Integration and Validation of a Natural Language Processing Machine Learning Suicide Risk Prediction Model Based on Open-Ended Interview Language in the Emergency Department.

Authors:  Joshua Cohen; Jennifer Wright-Berryman; Lesley Rohlfs; Douglas Trocinski; LaMonica Daniel; Thomas W Klatt
Journal:  Front Digit Health       Date:  2022-02-02

5.  Multimodal Assessment of Schizophrenia and Depression Utilizing Video, Acoustic, Locomotor, Electroencephalographic, and Heart Rate Technology: Protocol for an Observational Study.

Authors:  Robert O Cotes; Mina Boazak; Emily Griner; Zifan Jiang; Bona Kim; Whitney Bremer; Salman Seyedi; Ali Bahrami Rad; Gari D Clifford
Journal:  JMIR Res Protoc       Date:  2022-07-13

6.  Public patient views of artificial intelligence in healthcare: A nominal group technique study.

Authors:  Omar Musbahi; Labib Syed; Peter Le Feuvre; Justin Cobb; Gareth Jones
Journal:  Digit Health       Date:  2021-12-15
