Yonghao Jin, Fei Li, Varsha G Vimalananda, Hong Yu.
Abstract
BACKGROUND: Hypoglycemic events are common and potentially dangerous conditions among patients being treated for diabetes. Automatic detection of such events could improve patient care and support population-level studies. Electronic health records (EHRs) are a rich resource for detecting such events.
Keywords: adverse events; convolutional neural networks; hypoglycemia; natural language processing
Year: 2019 PMID: 31702562 PMCID: PMC6913754 DOI: 10.2196/14340
Source DB: PubMed Journal: JMIR Med Inform
Figure 1. Model architecture of our High-Performing System for Automatically Detecting Hypoglycemic Events (HYPE). The architecture can be divided into three parts: (1) an input layer computing word embeddings for each word, (2) a sentence embedding layer generating sentence vectors of a fixed dimension regardless of the input sentence length, and (3) an output layer projecting the sentence vector onto a probability score for each class.
Figure 2. Recurrent neural network layer with forward and backward connections. In a unidirectional setting, the backward connections (dashed lines) are absent.
Figure 3. Convolutional neural network layer. Each color represents a different filter, possibly with a different window size. The max pooling operation produces a single signal value for each filter, and the sentence vector is constructed by concatenating the signal values from all filters.
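The convolution and max-pooling operations described in the caption can be sketched in plain Python. The embedding dimension, window sizes, and random filter weights below are illustrative stand-ins, not the trained parameters of HYPE:

```python
import random

def conv_max_pool(sentence, filters):
    """Encode a sentence (a list of word-embedding vectors) into a
    fixed-size vector: one max-pooled signal per filter, regardless
    of sentence length."""
    signals = []
    for window, weights in filters:  # each filter has its own window size
        scores = []
        for i in range(len(sentence) - window + 1):
            # dot product of the filter with the concatenated word window
            span = [x for vec in sentence[i:i + window] for x in vec]
            scores.append(max(0.0, sum(w * x for w, x in zip(weights, span))))  # ReLU
        signals.append(max(scores) if scores else 0.0)  # max pooling over positions
    return signals  # sentence vector: one value per filter

random.seed(0)
dim = 4                                    # toy embedding dimension
sentence = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(7)]
filters = [(w, [random.uniform(-1, 1) for _ in range(w * dim)])
           for w in (2, 3, 4)]             # three filters, window sizes 2-4
vec = conv_max_pool(sentence, filters)
print(len(vec))  # fixed length = number of filters
```

A shorter or longer sentence produces a vector of the same length, which is the property the caption highlights: the sentence vector dimension depends only on the number of filters.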
Hyperparameter settings in our model.
| Hyperparameter | Optimum value | Search range |
| Learning rate | 5×10⁻⁵ | {1×10⁻³, 1×10⁻⁴, ..., 1×10⁻⁶} |
| Batch size | 64 | {16, 32, 64, 128, 256} |
| Sentence vector size | 300 | {100, 200, 300, 400, 500} |
| Dropout rate | 0.5 | {0.1, 0.2, 0.3, ..., 0.8} |
| Down-sampling rate | 0a | {0, 0.1, ..., 1} |
aThe optimum setting had no down-sampling.
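Assuming the down-sampling rate is the fraction of negative (majority-class) sentences randomly discarded before training, which the excerpt does not spell out, the hyperparameter can be sketched as follows; the data and function names are hypothetical:

```python
import random

def down_sample(examples, rate, seed=0):
    """Drop each negative example (label 0) with probability `rate`,
    keeping all positives. rate=0 keeps every example, matching the
    optimum setting found in the hyperparameter search."""
    rng = random.Random(seed)
    return [(x, y) for x, y in examples
            if y == 1 or rng.random() >= rate]

# Toy corpus: 10 positive and 100 negative sentences (imbalance is
# typical for event detection in EHR notes).
data = [(f"sent{i}", 1 if i < 10 else 0) for i in range(110)]
kept = down_sample(data, rate=0.9)
pos = sum(1 for _, y in kept if y == 1)
print(pos, len(kept) - pos)  # all positives kept, most negatives dropped
```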
Performance of the SVM (support vector machine) baseline and HYPE (High-Performing System for Automatically Detecting Hypoglycemic Events) based on different kinds of neural networks.
| Performance measures | SVM | P valuea | LSTMb | P value | Bi-LSTMc | P value | TCNd | P value | CNNe | P value |
| Precision, mean (SD) | 0.74 (0.07) | <.001 | 0.91 (0.02) | <.001 | 0.91 (0.02) | <.001 | 0.92 (0.03) | .05 | 0.96 (0.03) | N/Af |
| Recall, mean (SD) | 0.57 (0.05) | <.001 | 0.86 (0.02) | .02 | 0.87 (0.04) | .10 | 0.89 (0.04) | N/A | 0.86 (0.03) | .10 |
| F1, mean (SD) | 0.64 (0.03) | <.001 | 0.88 (0.02) | <.001 | 0.88 (0.02) | .001 | 0.90 (0.02) | .30 | 0.91 (0.02) | N/A |
| PR-AUCg | 0.745 | N/A | 0.934 | N/A | 0.942 | N/A | 0.964 | N/A | 0.966 | N/A |
| ROC-AUCh | 0.970 | N/A | 0.996 | N/A | 0.997 | N/A | 0.998 | N/A | 0.998 | N/A |
aP values are based on two-sample t tests between the performance of the system and the best-performing system; values <.05 are significant.
bLSTM: long short-term memory.
cbi-LSTM: bidirectional long short-term memory.
dTCN: temporal convolutional neural network.
eCNN: convolutional neural network.
fN/A: not applicable.
gPR-AUC: precision-recall area under the curve.
hROC-AUC: receiver operating characteristic area under the curve.
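Footnote a reports two-sample t tests over repeated runs. A minimal sketch of Welch's t statistic (the unequal-variance variant of the two-sample t test) computed from the table's summary statistics; the number of runs per system (n = 10 here) is an assumption, as it is not stated in this excerpt:

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's two-sample t statistic and Welch-Satterthwaite degrees
    of freedom, from per-group mean, standard deviation, and count."""
    v1, v2 = s1 ** 2 / n1, s2 ** 2 / n2
    t = (m1 - m2) / math.sqrt(v1 + v2)
    df = (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    return t, df

# Comparing SVM F1 (0.64, SD 0.03) with CNN F1 (0.91, SD 0.02),
# assuming n = 10 runs per system (hypothetical).
t, df = welch_t(0.64, 0.03, 10, 0.91, 0.02, 10)
print(round(t, 2), round(df, 1))
```

A gap this large relative to the run-to-run spread yields a t statistic far in the tails, consistent with the table's P<.001 for the SVM-vs-CNN comparison.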
Figure 4. Precision-recall (PR) and receiver operating characteristic (ROC) curves of each model. Bi-LSTM: bidirectional long short-term memory; CNN: convolutional neural network; LSTM: long short-term memory; SVM: support vector machine; TCN: temporal convolutional neural network.
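The ROC-AUC values reported above can be read as a rank statistic: the probability that a randomly chosen positive sentence receives a higher score than a randomly chosen negative one. A minimal sketch of this equivalence, using toy scores rather than the paper's model outputs:

```python
def roc_auc(scores, labels):
    """ROC-AUC via the Mann-Whitney U statistic: the fraction of
    positive-negative pairs where the positive outscores the
    negative (ties count as half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

scores = [0.9, 0.8, 0.7, 0.3, 0.2]  # toy classifier scores
labels = [1, 1, 0, 1, 0]            # toy gold labels
print(roc_auc(scores, labels))
```

This pairwise view also explains why ROC-AUC stays near 1 on heavily imbalanced data while PR-AUC separates the models more sharply, as in the table.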
Effect of down-sampling on convolutional neural network (CNN) model performance.
| Performance measures | Ratio of positive to negative training examples, mean (SD) | | |
| | 1:1 | 1:4 | 1:9 |
| Precision | 0.46 (0.03) | 0.86 (0.04) | 0.93 (0.03) |
| Recall | 0.92 (0.02) | 0.89 (0.03) | 0.88 (0.02) |
| F1 | 0.62 (0.03) | 0.87 (0.03) | 0.91 (0.02) |
Convolutional neural network (CNN) model performance when trained on different percentages of the training examples.
| Performance measures | Percentage of training examples used, mean (SD) | | | | |
| | 5% | 10% | 20% | 40% | 80% |
| Precision | 0.81 (0.38) | 0.97 (0.02) | 0.96 (0.02) | 0.96 (0.03) | 0.95 (0.02) |
| Recall | 0.03 (0.03) | 0.43 (0.05) | 0.67 (0.03) | 0.77 (0.04) | 0.85 (0.03) |
| F1 | 0.05 (0.05) | 0.60 (0.04) | 0.79 (0.03) | 0.86 (0.03) | 0.90 (0.02) |