Gabriela M Pawlowski1,2, Sujoy Ghosh-Hajra1,3, Shaun D Fickling1,3, Careesa C Liu1,3, Xiaowei Song1,3, Stephen Robinovitch2, Sam M Doesburg2, Ryan C N D'Arcy1,2,3.
Abstract
The critical need for rapid, objective, physiological evaluation of brain function at the point of care has led to the emergence of brain vital signs: a framework encompassing portable electroencephalography (EEG) and an automated, quick test protocol. This framework enables access to well-established event-related potential (ERP) markers, which are specific to sensory, attention, and cognitive functions in both healthy and patient populations. However, all of our applications to date have used auditory stimulation, which has highlighted application challenges in persons with hearing impairments (e.g., aging, seniors, dementia). Consequently, it has become important to translate brain vital signs into the visual sensory modality. The objectives of this study were therefore to: 1) demonstrate the feasibility of visual brain vital signs; and 2) compare and normalize results from visual and auditory brain vital signs. Data were collected from 34 healthy adults (33 ± 13 years) using a 64-channel EEG system. Visual and auditory sequences were kept as comparable as possible to elicit the N100, P300, and N400 responses. Visual brain vital signs were elicited successfully for all three responses across the group (N100: F = 29.8380, p < 0.001; P300: F = 138.8442, p < 0.0001; N400: F = 6.8476, p = 0.01). Initial auditory-visual comparisons across the three components showed that attention processing (P300) was the most transferable across modalities, with no group-level differences and correlated peak amplitudes (rho = 0.7, p = 0.0001) across individuals. Auditory P300 latencies were shorter than visual (p < 0.0001), but normalization and correlation (r = 0.5, p = 0.0033) implied a potential systematic difference across modalities.
Reduced auditory N400 amplitudes compared to visual (p = 0.0061), paired with normalization and correlation across individuals (r = 0.6, p = 0.0012), also revealed potential systematic modality differences between reading and listening language comprehension. This study provides an initial understanding of the relationship between the visual and auditory sequences, while importantly establishing a visual sequence within the brain vital signs framework. With both auditory and visual stimulation capabilities available, it is possible to broaden applications across the lifespan.
Keywords: clinical assessment; electroencephalogram (EEG); event-related potentials (ERPs); neurology; point-of-care; vital signs
Year: 2019 PMID: 30713487 PMCID: PMC6346702 DOI: 10.3389/fnins.2018.00968
Source DB: PubMed Journal: Front Neurosci ISSN: 1662-453X Impact factor: 4.677
Figure 1. (A) Schematic illustration of a sample of the visual stimulus sequence, containing the subject's name and word pairs. (B) Stimulus durations and inter-stimulus intervals with jitter. The total sequence is approximately 4.6 minutes long.
Figure 2. Grand-averaged waveforms for the N100 (*) and P300 (+) components in the auditory (top) and visual (bottom) modalities.
Figure 3. Grand-averaged waveforms for the N400 (**) in the auditory (top) and visual (bottom) modalities.
Summary Statistics: Mean amplitude measures for group-level N100 and P300 (μV).
| Component | Channel | Auditory standard | Auditory deviant | Visual standard | Visual deviant |
| N100 | Fz | −1.46 ± 1.84 | −4.09 ± 2.69 | −0.90 ± 2.12 | −2.60 ± 2.63 |
| | Cz | −0.97 ± 1.45 | −3.42 ± 2.29 | −1.05 ± 2.20 | −2.27 ± 2.44 |
| P300 | Fz | 0.01 ± 1.06 | 2.81 ± 2.50 | −1.22 ± 3.03 | 2.57 ± 4.62 |
| | Cz | 0.22 ± 0.94 | 3.40 ± 2.34 | −0.12 ± 3.02 | 4.74 ± 4.28 |
| | Pz | 0.21 ± 0.64 | 2.42 ± 2.15 | 1.01 ± 2.10 | 5.81 ± 3.61 |
Mean ± SD.
Summary Statistics: Mean amplitude measures for group-level N400 (μV).
| Component | Channel | Auditory congruent | Auditory incongruent | Visual congruent | Visual incongruent |
| N400 | Cz | 4.26 ± 4.47 | −1.15 ± 4.62 | −1.28 ± 6.92 | −2.88 ± 7.04 |
| | Pz | 2.31 ± 3.18 | −2.27 ± 3.45 | 1.58 ± 5.67 | −0.25 ± 6.13 |
Mean ± SD.
Summary of the Effects Tests: F-ratio and p-values of all the main effects and interaction effects of mean amplitude ANOVAs.
| Component | Effect | F (auditory) | p (auditory) | F (visual) | p (visual) |
| N100 | Stimulus | | | | |
| | Channel | 3.8962 | 0.0516 | 1.0253 | 0.3142 |
| | Stimulus × Channel | 0.0907 | 0.7640 | 1.3884 | 0.2420 |
| P300 | Stimulus | | | | |
| | Channel | 1.7835 | 0.1717 | | |
| | Stimulus × Channel | 1.4747 | 0.2323 | 0.8177 | 0.4435 |
| N400 | Stimulus | | | | |
| | Channel | | | | |
| | Stimulus × Channel | 0.5831 | 0.4471 | 0.0327 | 0.8570 |
Significance of <0.05 is denoted with bold text.
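The Stimulus main effect in these ANOVAs asks whether the two stimulus classes (e.g., standard vs. deviant) evoke different mean amplitudes across subjects. As a simplified sketch of that comparison only (the study's actual design is a repeated-measures ANOVA that also includes Channel and the Stimulus × Channel interaction; the per-subject amplitudes below are hypothetical, loosely echoing the table's group means):

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject N100 mean amplitudes (in uV) at Cz for 34 subjects.
# These are illustrative values, not the study's data.
rng = np.random.default_rng(42)
standard = rng.normal(-1.0, 2.2, size=34)  # standard stimuli
deviant = rng.normal(-3.4, 2.3, size=34)   # deviant stimuli (larger N100)

# One-way ANOVA on the Stimulus factor (a simplification of the
# paper's repeated-measures design).
f_stat, p_value = stats.f_oneway(standard, deviant)
print(f"Stimulus effect: F = {f_stat:.4f}, p = {p_value:.4g}")
```

A larger F-ratio with p below 0.05 would indicate a reliable Stimulus main effect, analogous to the significant (bolded) entries in the table above.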
Summary Statistics: adjusted baseline amplitude and peak latency measures for group-level ERP characteristics at Cz.
| Component | Measure | Auditory | Visual | p |
| N100 | Amplitude (μV) | −9.17 ± 3.12 | −8.80 ± 3.26 | 0.8089 |
| | Latency (ms) | | | |
| P300 | Amplitude (μV) | 8.06 ± 3.79 | 8.87 ± 2.63 | 0.5040 |
| | Latency (ms) | | | |
| N400 | Amplitude (μV) | | | |
| | Latency (ms) | | | |
Mean ± SD. Significance of <0.05 is denoted with bold text.
Elemental Brain Scores (EBS) measures for group-level ERP characteristics.
| Component | Measure | Auditory | Visual | p |
| N100 | Amplitude | 0.52 ± 0.17 | 0.49 ± 0.17 | 0.4491 |
| | Latency | 0.49 ± 0.17 | 0.49 ± 0.17 | 0.9343 |
| P300 | Amplitude | 0.53 ± 0.17 | 0.56 ± 0.17 | 0.1818 |
| | Latency | 0.46 ± 0.16 | 0.50 ± 0.17 | 0.2629 |
| N400 | Amplitude | 0.52 ± 0.17 | 0.47 ± 0.17 | 0.0995 |
| | Latency | 0.53 ± 0.17 | 0.50 ± 0.17 | 0.4279 |
Mean ± SD within-subject elemental brain scores across modalities.
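Elemental brain scores place heterogeneous ERP measures (amplitudes in μV, latencies in ms) on a common 0-1 scale so they can be compared across components and modalities, as in the table above. The published EBS transformation uses its own normative reference ranges; the function below is only a minimal min-max sketch of the normalization idea, and its name is hypothetical:

```python
import numpy as np

def elemental_brain_score(values):
    """Rescale a group's raw ERP measures onto a 0-1 scale.

    NOTE: a simplified min-max stand-in for the brain vital signs
    EBS transformation, which is based on normative ranges rather
    than the observed group minimum and maximum.
    """
    v = np.asarray(values, dtype=float)
    lo, hi = v.min(), v.max()
    if hi == lo:                      # degenerate group: no spread
        return np.full_like(v, 0.5)
    return (v - lo) / (hi - lo)

# Example: hypothetical P300 peak amplitudes (uV) for five subjects.
scores = elemental_brain_score([2.0, 4.0, 6.0, 8.0, 10.0])
print(scores)  # [0.   0.25 0.5  0.75 1.  ]
```

Once both modalities are mapped onto the same scale, auditory and visual scores can be compared directly, which is what makes the radar plot in Figure 4 possible.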
Figure 4. Radar plot of amplitude and latency EBS values for both modalities across all three ERP components.
Correlations of amplitude and latency measures at Cz.
| Component | Measure | r / rho | p |
| N100 | Amplitude (μV) | 0.3 | 0.1737 |
| | Latency (ms) | 0.04 | 0.8470 |
| P300 | Amplitude (μV) | | |
| | Latency (ms) | | |
| N400 | Amplitude (μV) | | |
| | Latency (ms) | 0.2 | 0.3135 |
Pearson's r was used for all normally distributed data; Spearman's rho was used for the non-normally distributed measure (P300 amplitude).
Significance of <0.05 is denoted with bold text.
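The table's footnote describes switching between Pearson's r and Spearman's rho depending on whether the data are normally distributed. A sketch of that decision rule is below; the use of the Shapiro-Wilk test as the normality check is an assumption (the record does not state which test was used), and the function name is hypothetical:

```python
import numpy as np
from scipy import stats

def cross_modal_correlation(auditory, visual, alpha=0.05):
    """Correlate per-subject auditory vs. visual ERP measures.

    Uses Pearson's r when both samples pass a normality check,
    otherwise falls back to Spearman's rho (as done here for the
    P300 amplitude). Shapiro-Wilk is an assumed choice of test.
    """
    a = np.asarray(auditory, dtype=float)
    v = np.asarray(visual, dtype=float)
    normal = (stats.shapiro(a).pvalue > alpha and
              stats.shapiro(v).pvalue > alpha)
    if normal:
        r, p = stats.pearsonr(a, v)
        return "pearson", r, p
    rho, p = stats.spearmanr(a, v)
    return "spearman", rho, p

# Evenly spaced, perfectly linear toy data stays in the Pearson branch.
method, r, p = cross_modal_correlation(range(1, 11), range(2, 22, 2))
print(method, round(r, 2))
```

Spearman's rho depends only on ranks, so it tolerates outliers and non-normal amplitude distributions at the cost of some sensitivity when the data really are normal.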
Figure 5. Correlation analysis between auditory and visual adjusted baseline amplitude values for each subject. Significance of <0.05 is denoted with *.
Figure 6. Correlation analysis between auditory and visual peak latency values for each subject. Significance of <0.05 is denoted with *.