Jiayi Shen1,2, Jiebin Chen3, Zequan Zheng4, Jiabin Zheng4, Zherui Liu3, Jian Song5, Sum Yi Wong4, Xiaoling Wang6, Mengqi Huang6, Po-Han Fang4, Bangsheng Jiang4, Winghei Tsang1, Zonglin He4, Taoran Liu7, Babatunde Akinwunmi8,9, Chi Chiu Wang10, Casper J P Zhang11, Jian Huang12, Wai-Kit Ming1.
Abstract
BACKGROUND: Gestational diabetes mellitus (GDM) can cause adverse outcomes for both mothers and their newborns. However, pregnant women living in low- and middle-income areas or countries often fail to receive early clinical interventions at local medical facilities due to the restricted availability of GDM diagnosis. The strong performance of artificial intelligence (AI) in disease diagnosis in previous studies demonstrates its promising applications in GDM diagnosis.
Keywords: AI; app; application; artificial intelligence; diabetes; diagnosis; disease diagnosis; gestational diabetes; innovation; maternal health care; rural; women
Year: 2020 PMID: 32930674 PMCID: PMC7525402 DOI: 10.2196/21573
Source DB: PubMed Journal: J Med Internet Res ISSN: 1438-8871 Impact factor: 5.428
Baseline characteristics.
| Demographic and clinical variables | Developmental (training) set (N=12,304) | | P value | External validation set (N=1655) | | P value |
| | GDMa (n=2761) | NGTb (n=9543) | | GDM (n=240) | NGT (n=1415) | |
| Age (years) | 30.21 (4.42) | 28.50 (3.98) | <.001 | 32.87 (4.71) | 30.33 (4.82) | <.001 |
| Fasting glucose (mmol/L) | 4.89 (0.73) | 4.37 (0.36) | <.001 | 4.74 (0.58) | 4.32 (0.28) | <.001 |
| 1-h postload plasma glucose (mmol/L) | 9.82 (1.84) | 7.33 (1.38) | <.001 | 10.16 (1.63) | 7.25 (1.35) | <.001 |
| 2-h postload plasma glucose (mmol/L) | 8.53 (1.65) | 6.47 (1.04) | <.001 | 8.63 (1.23) | 6.24 (1.02) | <.001 |
aGDM: gestational diabetes mellitus.
bNGT: normal glucose tolerance.
Values are mean (SD); P values compare the GDM and NGT groups.
The detection performance of 9 algorithms for the internal validation dataset.
| Algorithms | Accuracy | Sensitivity | Specificity | PPVa | NPVb | Brier score | AUCc |
| SVMd | 0.849 | 0.377 | 0.985 | 0.880 | 0.845 | 0.151 | 0.766 |
| Random forest | 0.833 | 0.432 | 0.949 | 0.709 | 0.852 | 0.167 | 0.728 |
| AdaBoost | 0.860 | 0.376 | 1 | 1 | 0.847 | 0.140 | 0.763 |
| kNNe | 0.841 | 0.415 | 0.964 | 0.768 | 0.851 | 0.159 | 0.723 |
| NBf | 0.845 | 0.367 | 0.983 | 0.860 | 0.843 | 0.155 | 0.768 |
| Decision tree | 0.838 | 0.431 | 0.956 | 0.738 | 0.853 | 0.162 | 0.706 |
| LRg | 0.844 | 0.363 | 0.984 | 0.865 | 0.842 | 0.156 | 0.765 |
| XGBoosth | 0.860 | 0.377 | 1 | 1 | 0.847 | 0.140 | 0.771 |
| GBDTi | 0.860 | 0.376 | 1 | 1 | 0.847 | 0.140 | 0.772 |
aPPV: positive predictive value.
bNPV: negative predictive value.
cAUC: area under the curve.
dSVM: support vector machine.
ekNN: k-nearest neighbors.
fNB: naive Bayes.
gLR: logistic regression.
hXGBoost: eXtreme gradient boosting.
iGBDT: gradient boosting decision tree.
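The metrics tabulated above follow directly from the 2×2 confusion matrix, and the Brier score is the mean squared error between predicted probabilities and observed 0/1 outcomes. A minimal sketch, using illustrative counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard binary diagnostic metrics from confusion-matrix counts."""
    total = tp + fp + fn + tn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),  # true-positive rate: GDM cases detected
        "specificity": tn / (tn + fp),  # true-negative rate: NGT correctly ruled out
        "ppv": tp / (tp + fp),          # precision among women flagged as GDM
        "npv": tn / (tn + fn),          # reliability of a negative result
    }

def brier_score(probs, labels):
    """Mean squared error between predicted probabilities and 0/1 outcomes."""
    return sum((p - y) ** 2 for p, y in zip(probs, labels)) / len(probs)

# Illustrative cohort of 1000 women with ~10% GDM prevalence (made-up counts).
m = diagnostic_metrics(tp=40, fp=5, fn=60, tn=895)
```

This also explains the pattern in the tables: when GDM prevalence is low and specificity is near 1, accuracy and NPV remain high even though sensitivity (and hence many missed cases) stays low.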
The detection performance of 9 algorithms for the external validation dataset.
| Algorithms | Accuracy | Sensitivity | Specificity | PPVa | NPVb | Brier score | AUCc |
| SVMd | 0.887 | 0.221 | 1 | 1 | 0.883 | 0.113 | 0.780 |
| Random forest | 0.838 | 0.263 | 0.936 | 0.409 | 0.882 | 0.162 | 0.655 |
| AdaBoost | 0.882 | 0.183 | 1 | 1 | 0.878 | 0.118 | 0.736 |
| kNNe | 0.862 | 0.254 | 0.965 | 0.550 | 0.884 | 0.138 | 0.669 |
| NBf | 0.878 | 0.263 | 0.982 | 0.716 | 0.887 | 0.122 | 0.774 |
| Decision tree | 0.841 | 0.242 | 0.942 | 0.414 | 0.880 | 0.159 | 0.614 |
| LRg | 0.877 | 0.258 | 0.983 | 0.713 | 0.887 | 0.123 | 0.769 |
| XGBoosth | 0.882 | 0.183 | 1 | 1 | 0.878 | 0.118 | 0.742 |
| GBDTi | 0.882 | 0.183 | 1 | 1 | 0.878 | 0.118 | 0.757 |
aPPV: positive predictive value.
bNPV: negative predictive value.
cAUC: area under the curve.
dSVM: support vector machine.
ekNN: k-nearest neighbors.
fNB: naive Bayes.
gLR: logistic regression.
hXGBoost: eXtreme gradient boosting.
iGBDT: gradient boosting decision tree.
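The AUC column has a direct probabilistic reading: it is the chance that a randomly chosen GDM case receives a higher predicted risk than a randomly chosen NGT case, independent of any decision threshold. A rank-based sketch with toy scores (not the study's predictions):

```python
def auc_from_scores(pos_scores, neg_scores):
    """AUC as the probability a positive outranks a negative; ties count half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Toy risk scores: GDM cases mostly, but not always, score higher than NGT.
gdm_scores = [0.9, 0.7, 0.4]
ngt_scores = [0.2, 0.3, 0.5]
auc = auc_from_scores(gdm_scores, ngt_scores)  # 8 of 9 pairs ranked correctly
```

This separation of ranking from thresholding is why, for example, SVM can post an external-validation AUC of 0.780 while its sensitivity at the chosen operating point is only 0.221: the model ranks most case-control pairs correctly, but the threshold trades missed cases for near-perfect specificity.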
Figure 1. Overall area under the receiver operating characteristic curves for the internal validation dataset. SVM: support vector machine; kNN: k-nearest neighbors; NB: naive Bayes; LR: logistic regression; XGBoost: eXtreme gradient boosting; GBDT: gradient boosting decision tree.
Demonstration of the AI application.
| Sample | Age | Fasting glucose (mmol/L) | Result with AI application |
| 1 | 35 | 5.2 | GDMa |
| 2 | 25 | 4.5 | No GDM |
| 3 | 27 | 4.8 | No GDM |
| 4 | 33 | 3.5 | No GDM |
| 5 | 37 | 5.6 | GDM |
| 6 | 30 | 4.3 | No GDM |
| 7 | 30 | 6.7 | GDM |
| 8 | 27 | 5.4 | GDM |
aGDM: gestational diabetes mellitus.
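On these eight demonstration samples, a toy threshold rule happens to reproduce the app's outputs. This is purely illustrative and is not the study's trained classifier; it only shows that the IADPSG fasting-glucose criterion of 5.1 mmol/L separates these particular samples on its own:

```python
# (age, fasting glucose in mmol/L) for the eight demonstration samples above.
samples = [(35, 5.2), (25, 4.5), (27, 4.8), (33, 3.5),
           (37, 5.6), (30, 4.3), (30, 6.7), (27, 5.4)]

# Toy rule, NOT the trained model: flag GDM when fasting glucose >= 5.1 mmol/L
# (the IADPSG fasting criterion). Age is ignored by this simplistic rule.
predictions = ["GDM" if glucose >= 5.1 else "No GDM" for _, glucose in samples]
```

The trained models additionally weight age (and, where available, postload glucose), so their decision boundary is not a single fasting-glucose cutoff; the agreement here reflects only this small sample.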
Figure 2. How the AI app works.
Figure 3. Structure of the app.
Figure 4. Interface of the app.