Henry W Dong, Caitlin Mills, Robert T Knight, Julia W Y Kam.
Abstract
Mind wandering is often characterized by attention oriented away from an external task towards our internal, self-generated thoughts. This universal phenomenon has been linked to numerous disruptive functional outcomes, including performance errors and negative affect. Despite its prevalence and impact, studies to date have yet to identify robust behavioral signatures, making unobtrusive, yet reliable detection of mind wandering a difficult but important task for future applications. Here we examined whether electrophysiological measures can be used in machine learning models to accurately predict mind wandering states. We recorded scalp EEG from participants as they performed an auditory target detection task and self-reported whether they were on task or mind wandering. We successfully classified attention states both within (person-dependent) and across (person-independent) individuals using event-related potential (ERP) measures. Non-linear and linear machine learning models detected mind wandering above-chance within subjects: support vector machine (AUC = 0.715) and logistic regression (AUC = 0.635). Importantly, these models also generalized across subjects: support vector machine (AUC = 0.613) and logistic regression (AUC = 0.609), suggesting we can reliably predict a given individual's attention state based on ERP patterns observed in the group. This study is the first to demonstrate that machine learning models can generalize to "never-seen-before" individuals using electrophysiological measures, highlighting their potential for real-time prediction of covert attention states.
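The person-independent analysis described here (training on a group, then predicting a held-out individual) corresponds to leave-one-subject-out cross-validation. The sketch below illustrates that scheme with scikit-learn, using the two model families named in the abstract (RBF-kernel SVM and logistic regression). It is not the authors' code: the data are synthetic stand-ins for ERP features, and the subject/trial counts and injected effect size are arbitrary assumptions for illustration.

```python
# Illustrative sketch only (not the authors' pipeline): leave-one-subject-out
# classification of attention state from ERP features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic stand-in data: trials x 4 ERP-derived features per subject;
# labels 1 = mind wandering (MW), 0 = on task (OT).
n_subjects, n_trials, n_features = 10, 80, 4
X = rng.normal(size=(n_subjects, n_trials, n_features))
y = rng.integers(0, 2, size=(n_subjects, n_trials))
X += y[..., None] * 0.5  # inject a weak, arbitrary class effect

def loso_auc(model):
    """Mean AUC across leave-one-subject-out folds."""
    aucs = []
    for test in range(n_subjects):
        train = [s for s in range(n_subjects) if s != test]
        model.fit(X[train].reshape(-1, n_features), y[train].reshape(-1))
        scores = model.predict_proba(X[test])[:, 1]
        aucs.append(roc_auc_score(y[test], scores))
    return float(np.mean(aucs))

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
logreg = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print(f"SVM AUC: {loso_auc(svm):.3f}, LogReg AUC: {loso_auc(logreg):.3f}")
```

Because the held-out subject contributes no training trials, above-chance AUC under this scheme implies the ERP-to-attention-state mapping generalizes across individuals, which is the paper's central claim.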
Year: 2021 PMID: 33979407 PMCID: PMC8115801 DOI: 10.1371/journal.pone.0251490
Source DB: PubMed Journal: PLoS One ISSN: 1932-6203 Impact factor: 3.240
Fig 1. Behavioral results as a function of attention state.
(A) Mean response time was marginally slower during mind wandering (p = 0.076). (B) Accuracy did not differ between the two attention states. Error bars = standard error of the mean; OT = on task; MW = mind wandering.
ANOVAs on ERP components.
| Features | Attention (OT vs. MW) | Tone (Standard vs. Target) | Attention × Tone Interaction |
|---|---|---|---|
| N1 Min | | | |
| P3 Max | | | |
Note: MW = mind wandering. OT = on task.
Repeated measures ANOVAs of main effects of attention and tone as well as attention × tone interaction, reported separately for N1 and P3 ERP components.
Fig 2. Grand average ERP waveforms.
N1 was averaged across FC1, FCz and FC2 (left panel), whereas P3 was averaged across P1, Pz and P2 (right panel). Univariate analyses indicate reduced N1 in response to target tones during MW relative to OT periods. OT = on task, MW = mind wandering.
Fig 3. Model performance per subject, as measured by AUC and MCC.
AUC performance for each subject for both models (SVM and logistic regression) is shown in the top panel; chance is marked by the black horizontal line at 0.5. MCC performance for each subject is shown in the bottom panel; chance is marked by the black horizontal line at 0. AUC = area under the curve; MCC = Matthews correlation coefficient; SVM = support vector machine.
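The two metrics plotted in Fig 3 have different chance levels (0.5 for AUC, 0 for MCC) because AUC scores ranked probabilities while MCC scores hard predictions. A minimal illustration with scikit-learn, using made-up labels and scores for a single hypothetical subject:

```python
# Per-subject evaluation metrics as in Fig 3, on hypothetical data.
from sklearn.metrics import matthews_corrcoef, roc_auc_score

y_true  = [1, 1, 1, 0, 0, 0, 1, 0]                   # 1 = mind wandering
y_score = [0.9, 0.7, 0.4, 0.2, 0.3, 0.6, 0.8, 0.1]   # model probabilities
y_pred  = [int(s >= 0.5) for s in y_score]           # thresholded labels

# AUC uses the continuous scores (1.0 = perfect, 0.5 = chance).
print("AUC:", roc_auc_score(y_true, y_score))   # → 0.9375
# MCC uses the hard labels (1 = perfect, 0 = chance, -1 = inverted).
print("MCC:", matthews_corrcoef(y_true, y_pred))  # → 0.5
```

MCC is a useful complement to AUC here because it stays near 0 for a trivial classifier even when the two attention states are imbalanced, as self-reported mind-wandering labels typically are.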
Model evaluation of person-independent classification performance.
| Models | Accuracy | AUC | MCC |
|---|---|---|---|
| SVM with RBF Kernel | 0.591 (SD = 0.070) | 0.613 (SD = 0.085) | 0.206 (SD = 0.152) |
| Logistic Regression | 0.588 (SD = 0.093) | 0.609 (SD = 0.096) | 0.196 (SD = 0.169) |
Note: Classification performance indices, including accuracy, AUC, and MCC, are reported for both machine learning models: SVM with RBF kernel and logistic regression. AUC = area under the curve; MCC = Matthews correlation coefficient; SVM = support vector machine; RBF = radial basis function.
Confusion matrices of person-independent models.
| | Actual MW | Actual Not MW |
|---|---|---|
| SVM: Pred. MW | 0.570 | 0.390 |
| SVM: Pred. Not MW | 0.430 | 0.610 |
| Logistic Regression: Pred. MW | 0.528 | 0.356 |
| Logistic Regression: Pred. Not MW | 0.472 | 0.644 |
Note: Confusion matrix for each of the machine learning models: SVM with RBF kernel and logistic regression. Pred. = predicted; MW = mind wandering; SVM = support vector machine; RBF = radial basis function.
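Each "Actual" column in the matrices above sums to 1 (e.g., 0.570 + 0.430), so the entries are column-normalized rates: the MW column gives hit rate vs. miss rate, and the Not MW column gives false-alarm rate vs. correct-rejection rate. A sketch of that normalization with scikit-learn, on hypothetical labels:

```python
# Column-normalized confusion matrix (rows = predicted, columns = actual),
# matching the layout of the table above; labels are hypothetical.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]  # 1 = MW, 0 = not MW
y_pred = [1, 1, 1, 0, 1, 1, 0, 0, 0, 0]

# confusion_matrix returns rows = actual; transpose so rows = predicted.
cm = confusion_matrix(y_true, y_pred, labels=[1, 0]).T
cm_col = cm / cm.sum(axis=0, keepdims=True)  # normalize each actual-class column
print(cm_col)  # each column sums to 1
```

Normalizing per actual class makes the two models directly comparable even if they predict MW at different base rates.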
Model performance for individual features.
| Features | Accuracy | AUC | MCC |
|---|---|---|---|
| | 0.489 | 0.521 | 0.033 |
| | 0.322 | 0.471 | -0.084 |
| | 0.562 | 0.594 | 0.171 |
| | 0.518 | 0.567 | 0.106 |
Note: Model performance metrics, including accuracy, AUC, and MCC, computed separately for each individual ERP-component feature. Models for all four features were built with the SVM with RBF kernel. AUC = area under the curve; MCC = Matthews correlation coefficient; SVM = support vector machine; RBF = radial basis function.
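The single-feature analysis above trains one RBF-kernel SVM per ERP feature and compares the resulting AUCs. A hedged sketch of that loop (synthetic data; the feature indices stand in for the ERP feature names, which did not survive extraction in the table above):

```python
# Sketch of per-feature evaluation: one RBF-kernel SVM per ERP feature.
# Data are synthetic; effect sizes per feature are arbitrary assumptions.
import numpy as np
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n = 400
y = rng.integers(0, 2, size=n)                       # 1 = MW, 0 = OT
X = rng.normal(size=(n, 4)) + np.outer(y, [0.0, 0.1, 0.6, 0.3])

aucs = {}
for i in range(4):
    Xtr, Xte, ytr, yte = train_test_split(X[:, [i]], y, random_state=0)
    scores = SVC(kernel="rbf").fit(Xtr, ytr).decision_function(Xte)
    aucs[i] = roc_auc_score(yte, scores)
    print(f"feature {i}: AUC = {aucs[i]:.3f}")
```

Comparing per-feature AUCs against the full-model AUC (0.613 in the table above) shows how much each ERP component contributes on its own versus in combination.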