| Literature DB >> 35808261 |
Sreeza Tarafder, Nasreen Badruddin, Norashikin Yahya, Arbi Haza Nasution.
Abstract
Drowsiness is one of the main causes of road accidents and endangers the lives of road users. Recently, there has been considerable interest in utilizing features extracted from electroencephalography (EEG) signals to detect driver drowsiness. However, in most of the work performed in this area, the eyeblink or ocular artifacts present in EEG signals are considered noise and are removed during the preprocessing stage. In this study, we examined the possibility of extracting features from the EEG ocular artifacts themselves to classify alert and drowsy states. We used the BLINKER algorithm to extract 25 blink-related features from a public dataset comprising raw EEG signals collected from 12 participants. Different machine learning classification models, including the decision tree, the support vector machine (SVM), the K-nearest neighbor (KNN) method, and the bagged and boosted tree models, were trained on seven selected features and further optimized to improve their performance. We show that features from EEG ocular artifacts can classify drowsy and alert states, with the optimized ensemble of boosted trees yielding the highest accuracy, 91.10%, among the classic machine learning models tested.
Keywords: drowsiness detection; electroencephalography; ensemble learning; machine learning; ocular artifacts
Year: 2022 PMID: 35808261 PMCID: PMC9269018 DOI: 10.3390/s22134764
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.847
Different methods of drowsiness detection.
| Categories | Features | Measurements | Sensors |
|---|---|---|---|
| Vehicle-based | Vehicular | Steering wheel movement; angular velocity; acceleration; lateral distance | Attached to the vehicle |
| Driver-based | Behavioral | PERCLOS; PATECP; PATMIO/yawning; blinking; gaze detection; head pose; facial expression; hand motion | Not attached to the driver |
| Driver-based | Physiological | EEG; ECG; EOG; EMG; skin responses (GSR & PPG); fNIRS | Attached to the driver |
Figure 1. Flowchart of the research methodology.
Figure 2. A schematic diagram of an eye-blink signal with the various blink landmarks used by the BLINKER software [27].
Features from BLINKER and their descriptions.
| Feature Name | Feature Description |
|---|---|
| Duration Base (DB) | The difference between rightBase and leftBase to determine the blink length in seconds. |
| Duration Zero (DZ) | The difference between rightZero and leftZero to determine the blink length in seconds. |
| Duration Tent (DT) | The difference between the right and left x-intercepts of the tent lines to determine the blink length in seconds. |
| Duration Half Base (DHB) | The blink length in seconds, measured from the frame defining the left half-base amplitude to the first intersection of the horizontal line drawn from the blink value at that point with the downstroke of the blink. |
| Duration Half Zero (DHZ) | The blink length in seconds, measured from the frame defining the left half-zero amplitude to the first intersection of the horizontal line drawn from the blink value at that point with the downstroke of the blink. |
| Inter Blink Maximum Amplitude (IBMA) | The time in seconds between successive blink peaks. |
| Inter Blink Maximum Velocity Base (IBMVB) | The time in seconds between one blink’s maximum positive velocity and the following blink’s maximum positive velocity, both estimated from leftBase. |
| Inter Blink Maximum Velocity Zero (IBMVZ) | The time in seconds between one blink’s maximum positive velocity and the following blink’s maximum positive velocity, both estimated from leftZero. |
| Negative Amplitude Velocity Ratio Base (NAVRB) | The amplitude velocity ratio (AVR) computed over the maxBlink to rightBase interval. |
| Positive Amplitude Velocity Ratio Base (PAVRB) | The AVR computed over the leftBase to maxBlink interval. |
| Negative Amplitude Velocity Ratio Zero (NAVRZ) | The AVR computed over the maxBlink to rightZero interval. |
| Positive Amplitude Velocity Ratio Zero (PAVRZ) | The AVR computed over the leftZero to maxBlink interval. |
| Negative Amplitude Velocity Ratio Tent (NAVRT) | The AVR computed from the slope of the right tent line and the tent peak of the blink. |
| Positive Amplitude Velocity Ratio Tent (PAVRT) | The AVR computed from the slope of the left tent line and the tent peak of the blink. |
| Time Shut Base (TSB) | The time in seconds, measured from leftBase, for the blink to come within 90% of its maximum amplitude. |
| Time Shut Zero (TSZ) | The time in seconds, measured from leftZero, for the blink to come within 90% of its maximum amplitude. |
| Time Shut Tent (TST) | The time in seconds for the blink to come within 90% of the tent peak height. |
| Closing Time Zero (CTZ) | Difference between the maxFrame and leftZero calculated in seconds. |
| Reopening Time Zero (RTZ) | Difference between the rightZero and maxFrame calculated in seconds. |
| Closing Time Tent (CTT) | The difference in seconds between the leftXIntercept frame and the xIntersect (tent peak) frame that form the tent. |
| Reopening Time Tent (RTT) | The difference in seconds between the xIntersect (tent peak) frame and the rightXIntercept frame that form the tent. |
| Peak Time Blink (PTB) | The time in seconds, from the beginning of the file, at which the blink reaches its maximum. |
| Peak Time Tent (PTT) | Time in seconds since the beginning of the file of the tent’s peak. |
| Peak Max Blink (PMB) | Maximum blink amplitude. |
| Peak Max Tent (PMT) | Maximum tent peak height. |
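As an illustration of the blink landmarks in Figure 2, the sketch below locates a single blink's peak (maxFrame) and its surrounding zero crossings (leftZero, rightZero) on a synthetic signal and derives a Duration Zero (DZ)-style feature. This is a simplified stand-in, not the BLINKER implementation; the half-cosine blink shape and 250 Hz sampling rate are assumptions for the example.

```python
import numpy as np

def blink_landmarks(signal, fs):
    """Toy landmark extraction: peak frame plus the zero crossings on either
    side, and the DZ-style duration rightZero - leftZero in seconds.
    Landmark names follow the paper's Figure 2 terminology."""
    max_frame = int(np.argmax(signal))
    # leftZero: last zero sample before the blink's positive excursion
    left = max_frame
    while left > 0 and signal[left - 1] > 0:
        left -= 1
    # rightZero: first zero sample after the blink's positive excursion
    right = max_frame
    while right < len(signal) - 1 and signal[right + 1] > 0:
        right += 1
    return {"maxFrame": max_frame, "leftZero": left, "rightZero": right,
            "durationZero": (right - left) / fs}

# Synthetic half-cosine "blink" on a zero baseline, ~0.3 s wide, 250 Hz
fs = 250
t = np.arange(-0.5, 0.5, 1 / fs)
blink = np.where(np.abs(t) < 0.15, np.cos(t / 0.15 * np.pi / 2), 0.0)
feats = blink_landmarks(blink, fs)
```

The same walk-outward idea extends to base-crossing landmarks (leftBase, rightBase) by thresholding at a baseline level instead of zero.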
The number of observations obtained from the BLINKER algorithm.
| Subjects | Alert | Drowsy |
|---|---|---|
| Subject 1 | 0 | 94 |
| Subject 2 | 197 | 50 |
| Subject 3 | 45 | 40 |
| Subject 4 | 25 | 86 |
| Subject 5 | 48 | 47 |
| Subject 6 | 105 | 182 |
| Subject 7 | 53 | 44 |
| Subject 8 | 15 | 58 |
| Subject 9 | 182 | 264 |
| Subject 10 | 79 | 156 |
| Subject 11 | 61 | 75 |
| Subject 12 | 13 | 45 |
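The per-subject counts above imply a noticeable class imbalance, and Subject 1 contributes no alert samples at all. A quick tally of the table:

```python
# Per-subject sample counts copied from the table above (Subjects 1-12)
alert = [0, 197, 45, 25, 48, 105, 53, 15, 182, 79, 61, 13]
drowsy = [94, 50, 40, 86, 47, 182, 44, 58, 264, 156, 75, 45]

n_alert, n_drowsy = sum(alert), sum(drowsy)
imbalance = n_drowsy / n_alert  # majority (drowsy) vs minority (alert)
print(n_alert, n_drowsy, round(imbalance, 2))  # 823 1141 1.39
```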
Figure 3. Selected features using the embedded feature selection technique; the line graph above the bars shows the cumulative sum of the importance values.
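A cumulative-importance selection like the one pictured can be sketched as follows: rank features by a tree ensemble's impurity-based importances and keep the smallest prefix whose cumulative importance passes a threshold. The 0.90 threshold, the use of scikit-learn's RandomForestClassifier, and the synthetic stand-in data are assumptions; the paper's exact embedded-selection procedure may differ.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Stand-in data shaped like the problem: 25 features, as with BLINKER's output
X, y = make_classification(n_samples=500, n_features=25, n_informative=7,
                           random_state=0)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

order = np.argsort(forest.feature_importances_)[::-1]      # most important first
cumsum = np.cumsum(forest.feature_importances_[order])     # cumulative importance
keep = int(np.searchsorted(cumsum, 0.90)) + 1              # smallest prefix >= 0.90
selected = order[:keep]
```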
Performances of the classification models.
| Model | Performance Metrics | Before Hyperparameter Tuning | After Hyperparameter Tuning | Tuned Hyperparameters and the Optimal Values |
|---|---|---|---|---|
| Decision Tree | TPR (%) | 77.60 | 76.60 | Maximum number of splits: 211; split criterion: Gini’s diversity index |
| | FPR (%) | 18.00 | 16.80 | |
| | FNR (%) | 22.40 | 16.90 | |
| | Precision (%) | 75.80 | 76.70 | |
| | Accuracy (%) | 80.20 | 80.40 | |
| | F1 score | 0.77 | 0.77 | |
| | AUC | 0.82 | 0.82 | |
| KNN | TPR (%) | 82.10 | 86.50 | Number of neighbors: 3; distance metric: Mahalanobis; distance weight: squared inverse |
| | FPR (%) | 14.50 | 12.60 | |
| | FNR (%) | 17.90 | 10.10 | |
| | Precision (%) | 80.30 | 83.10 | |
| | Accuracy (%) | 84.10 | 87.00 | |
| | F1 score | 0.81 | 0.85 | |
| | AUC | 0.90 | 0.93 | |
| SVM | TPR (%) | 75.20 | 82.50 | Box constraint level: 28.9228; kernel scale: 1; kernel function: cubic |
| | FPR (%) | 12.10 | 11.10 | |
| | FNR (%) | 16.90 | 12.50 | |
| | Precision (%) | 81.70 | 84.20 | |
| | Accuracy (%) | 82.50 | 86.20 | |
| | F1 score | 0.78 | 0.83 | |
| | AUC | 0.91 | 0.91 | |
| Ensemble of Bagged Trees | TPR (%) | 84.00 | 84.20 | Number of learners: 100; maximum number of splits: 84; number of predictors to sample: 8 |
| | FPR (%) | 14.10 | 12.20 | |
| | FNR (%) | 11.80 | 15.40 | |
| | Precision (%) | 81.00 | 83.20 | |
| | Accuracy (%) | 85.10 | 86.40 | |
| | F1 score | 0.83 | 0.84 | |
| | AUC | 0.93 | 0.94 | |
| Ensemble of Boosted Trees (AdaBoost) | TPR (%) | 79.70 | 91.00 | |
| | FPR (%) | 21.30 | 8.80 | |
| | FNR (%) | 15.70 | 6.70 | |
| | Precision (%) | 72.80 | 88.20 | |
| | Accuracy (%) | 79.10 | 91.10 | |
| | F1 score | 0.76 | 0.90 | |
| | AUC | 0.88 | 0.97 | |
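The metrics reported above relate to confusion-matrix counts in the standard way. A minimal sketch, using hypothetical counts (not the paper's) chosen so the rates are easy to read off:

```python
def metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    tpr = tp / (tp + fn)                         # true positive rate (recall)
    fpr = fp / (fp + tn)                         # false positive rate
    fnr = fn / (tp + fn)                         # false negative rate = 1 - TPR
    precision = tp / (tp + fp)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    f1 = 2 * precision * tpr / (precision + tpr)
    return tpr, fpr, fnr, precision, accuracy, f1

# Hypothetical counts: 91 of 100 drowsy and 91 of 100 alert samples correct
tpr, fpr, fnr, precision, accuracy, f1 = metrics(tp=91, fp=9, fn=9, tn=91)
```

Note that AUC is the exception: it summarizes the ROC curve over all decision thresholds and cannot be recovered from a single confusion matrix.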
Figure 4. Confusion matrix of the optimized ensemble AdaBoost method.