Davide Cirillo1, Silvina Catuara-Solarz2,3, Czuee Morey3,4, Emre Guney5, Laia Subirats6,7, Simona Mellino3, Annalisa Gigante3, Alfonso Valencia1,8, María José Rementeria1, Antonella Santuccione Chadha3, Nikolaos Mavridis3,9.
Abstract
Precision Medicine implies a deep understanding of inter-individual differences in health and disease that are due to genetic and environmental factors. To acquire such understanding there is a need to implement different types of artificial intelligence (AI) technologies that enable the identification of biomedically relevant patterns, facilitating progress towards individually tailored preventative and therapeutic interventions. Despite the significant scientific advances achieved so far, most of the currently used biomedical AI technologies do not account for bias detection. Furthermore, the design of the majority of algorithms ignores the sex and gender dimension and its contribution to health and disease differences among individuals. Failure to account for these differences will generate sub-optimal results and produce mistakes as well as discriminatory outcomes. In this review we examine the current sex and gender gaps in a subset of biomedical technologies used in relation to Precision Medicine. In addition, we provide recommendations to optimize their utilization to improve the global health and disease landscape and decrease inequalities.
Keywords: Biomarkers; Computational models; Medical ethics; Risk factors
Year: 2020 PMID: 32529043 PMCID: PMC7264169 DOI: 10.1038/s41746-020-0288-5
Source DB: PubMed Journal: NPJ Digit Med ISSN: 2398-6352
Fig. 1 The key determinants of health.
Health and wellbeing of individuals and communities are influenced by several factors, which include the person’s individual characteristics and behaviours and the socio-economic and physical environment, according to the World Health Organization (WHO) (www.who.int/hia/evidence/doh/en/). Sex and gender differences interact with the whole spectrum of health determinants.
Fig. 2 Desirable and undesirable biases in artificial intelligence for health.
Fair data generation and explainable algorithms are fundamental requirements for the design and application of artificial intelligence to optimize for health and wellbeing across the sex and gender spectrum. This will facilitate the reduction of undesirable biases that propagate inequity and discrimination, and will promote desirable differentiations that help develop Precision Medicine.
Illustrative examples of clinical conditions and studies in which desirable biases would be beneficial for both basic and clinical research as well as diagnosis and treatment.
| Clinical conditions and studies | Current status without the desirable bias | Utility of the desirable bias |
|---|---|---|
| Autistic spectrum disorder | Demonstrated age-dependent sex differences in symptomatology (impairments in social communication and interaction, expressive behaviour, reciprocal conversation, and non-verbal gestures) are currently not considered for diagnostic purposes. | Differential diagnostic criteria for males and females could facilitate clinical diagnosis, leading to appropriate treatment. |
| Cardiovascular disorders | Although it has been documented that men and women respond differently to many cardiovascular medications, such as statins, angiotensin-converting enzyme inhibitors and β-blockers, adopted treatments do not consider sex differences. | Prescribing according to the sex of the patient could lead to improved health benefits. |
| | Despite the fact that coronary heart disease (CHD) is the leading cause of death among women, they remain underrepresented in cardiovascular clinical research. | The application of a desirable bias towards women would lead to a more accurate representation of sex differences in clinical research. |
| Genome-wide association studies (GWAS) | Most GWAS focus on white male subjects. | Deliberately including female subjects and other ethnicities in GWAS would better account for potential sex differences in disease that are currently unknown because they are overlooked. |
| Human immunodeficiency virus (HIV) | The observed lower female representation in HIV clinical trials stems, among other factors, from women’s lower awareness of treatment and enrolment options compared with men. | Promoting empowerment initiatives for disadvantaged patients will increase their exposure to treatment options and clinical trial enrolment. |
Sources of undesirable bias in artificial intelligence, with examples in health research and practice.
| Source of bias in artificial intelligence | Description |
|---|---|
| Historical bias | Arises, even if the data are perfectly measured and sampled, when the world as it is leads a model to produce undesired outcomes, e.g. incorrectly assuming that HIV is inherently linked to homosexual and bisexual men because its prevalence is higher in this population. |
| Representation bias | Occurs when certain parts of the input space are underrepresented, e.g. European male populations are the primary focus of genomics research and its derived clinical findings, neglecting other ethnicities and populations. |
| Measurement bias | Occurs when measured data are proxies for some ideal features and labels, e.g. the use of clinical, social, and cognitive variables to detect the prodromal phase of schizophrenia and other psychotic disorders despite observed sex differences in the expression of those symptoms and their associated risk for psychosis. |
| Aggregation bias | Arises when a one-size-fits-all model is used for groups with different conditional distributions, e.g. haemoglobin A1c (HbA1c) levels are routinely used for the diagnosis and monitoring of diabetes despite differences across ethnicities. |
| Evaluation bias | Occurs when the evaluation and/or benchmark data for an algorithm do not represent the target population, e.g. commercial facial recognition algorithms underperform on dark-skinned female faces because most benchmark face image datasets come from white men. |
| Algorithmic bias | Occurs when bias is introduced into the algorithm, consciously or unconsciously, in ad-hoc solutions, e.g. by using health care cost as a proxy for health status without correcting for existing inequalities in health access, a commercial algorithm to predict health care needs was found to exhibit significant racial discrimination. |
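Several of the bias categories above (representation, aggregation, evaluation) surface as performance gaps between subgroups when a single model is evaluated per sex or ethnicity rather than in aggregate. The sketch below illustrates that kind of audit with synthetic, purely illustrative labels and predictions; it is not from the reviewed studies.

```python
# Minimal sketch of a subgroup audit: compare a classifier's accuracy
# across groups instead of reporting a single aggregate score.
# All data here is synthetic and illustrative only.

def subgroup_accuracy(y_true, y_pred, groups):
    """Return accuracy computed separately for each subgroup label."""
    scores = {}
    for g in set(groups):
        idx = [i for i, grp in enumerate(groups) if grp == g]
        correct = sum(y_true[i] == y_pred[i] for i in idx)
        scores[g] = correct / len(idx)
    return scores

y_true = [1, 0, 1, 1, 0, 1, 0, 0]   # hypothetical ground-truth labels
y_pred = [1, 0, 1, 0, 0, 0, 0, 1]   # hypothetical model predictions
groups = ["F", "F", "F", "F", "M", "M", "M", "M"]  # sex recorded per sample

acc = subgroup_accuracy(y_true, y_pred, groups)
gap = max(acc.values()) - min(acc.values())
print(acc, gap)  # a large gap flags a potential undesirable bias
```

A large gap between the best- and worst-served subgroup is a signal to revisit the training data composition or the model, in the spirit of the evaluation-bias example in the table.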
Categories of digital biomarkers.
| Category | Definition | Corresponding digital biomarker examples |
|---|---|---|
| Susceptibility and risk biomarker | A biomarker that indicates the potential for developing a disease or medical condition in an individual who does not currently have a clinically apparent disease or medical condition. | (a) Detect cognitive changes in healthy subjects at risk of developing Alzheimer’s disease using a video game platform. |
| Diagnostic biomarker | A biomarker used to detect or confirm the presence of a disease or condition of interest, or to identify individuals with a subtype of the disease. | (a) Diagnose ADHD in children using eye vergence metrics. (a) Detect depression and Parkinson’s disease using vocal biomarkers. (a) Diagnose asthma and respiratory infections using smartphone-recorded cough sounds. |
| Monitoring biomarker | A biomarker measured serially to assess the status of a disease or medical condition, or for evidence of exposure to (or effect of) a medical product or an environmental agent. | (a) Quantify Parkinson’s disease severity using smartphones and machine learning. (b) Track time and location of short-acting beta-agonist inhaler use through an attached wireless sensor. (a) Detect nocturnal scratching movements in patients with atopic dermatitis using accelerometers and recurrent neural networks. (b) Measure sympathetic nervous impulses at the skin and infer parasympathetic activity from heart rate variation to detect tonic-clonic epileptic seizures and immediately alert care providers. (b) Monitor atrial fibrillation, bradycardia, tachycardia or normal heart rhythm with a portable electrocardiogram sensor paired with a smartphone app and inform the clinician. (a) Measure treatment adherence in schizophrenia and bipolar disorder with an ingestible digital pill. |
| Endpoint digital biomarkers in clinical trials | Endpoints generated by the use of mobile technologies in clinical settings. | (a) Accelerometer-derived motor abnormalities for use in Parkinson’s disease. (b) Monitoring of multiple sclerosis patients with digital technologies using active and passive tests (ClinicalTrials.gov identifiers: NCT03523858; NCT02952911). (b) Virtual Reality Functional Capacity Assessment Tool as co-primary and secondary endpoint in schizophrenia and major depressive disorder. |
(a) Digital biomarker under development (in feasibility/exploratory stages).
(b) Digital biomarker in use in a clinical trial, an FDA-cleared/approved digital health product, or a digital health app in use not requiring approval.
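Many of the monitoring biomarkers above reduce a raw sensor stream to a simple summary statistic. As an illustration of the kind of computation involved, the sketch below implements RMSSD, a common heart-rate-variability summary related to the autonomic measures mentioned in the table; the RR intervals are synthetic example values, not data from any cited study.

```python
import math

# Illustrative sketch: RMSSD (root mean square of successive differences),
# a standard heart-rate-variability summary sometimes used as a digital
# biomarker of autonomic function. Input values are synthetic examples.

def rmssd(rr_intervals):
    """RMSSD over a sequence of RR intervals (milliseconds between beats)."""
    diffs = [b - a for a, b in zip(rr_intervals, rr_intervals[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795]  # hypothetical inter-beat intervals in ms
print(round(rmssd(rr), 1))
```

A wearable device would compute such summaries continuously over sliding windows before any disease-specific model is applied.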
Fig. 3 The digital divide in access to mobile technology around the globe.
The bar plot reports how much less likely a woman is to own a mobile phone than a man, according to a survey analysis on mobile ownership conducted by the Global System for Mobile Communications Association (GSMA) in low- and middle-income countries (LMIC) in 2019, by geographical area (source: GSMA “The Mobile Gender Gap Report 2020”[51]). For instance, in South Asia women are 23% less likely than men to own a mobile phone, while in Europe and Central Asia women are 1% more likely to own a mobile phone. Across LMICs (“Overall”), women are 8% less likely than men to own a mobile phone.
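The percentages in Fig. 3 are relative gaps, i.e. the difference between male and female ownership rates expressed as a fraction of the male rate. A minimal sketch of that computation, with placeholder rates rather than actual GSMA figures:

```python
# Sketch of a relative gender-gap metric for mobile ownership:
# gap = (male rate - female rate) / male rate.
# The rates below are illustrative placeholders, not GSMA survey data.

def gender_gap(male_rate, female_rate):
    """Fraction by which women are less likely than men to own a phone."""
    return (male_rate - female_rate) / male_rate

# e.g. 80% of men vs 61.6% of women owning a phone yields a 23% gap,
# the magnitude reported for South Asia in Fig. 3
print(round(gender_gap(0.80, 0.616), 2))
```

A negative value indicates women are more likely than men to own a phone, as in the Europe and Central Asia figure.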