Stavros I. Dimitriadis, Avraam D. Marimpis.
Abstract
A brain-computer interface (BCI) is a communication channel that translates brain activity into commands for operating a personal computer or other home and electrical devices. In other words, a BCI is an alternative way of interacting with the environment by using brain activity instead of muscles and nerves. For that reason, BCI systems are of high clinical value for target populations suffering from neurological disorders. In this paper, we present a new processing approach applied to three publicly available BCI data sets: (a) a well-known multi-class (N = 6) code-modulated visual evoked potential (c-VEP) BCI system for able-bodied and disabled subjects; (b) a multi-class (N = 32) c-VEP system with slow and fast stimulus presentation; and (c) a multi-class (N = 5) steady-state visual evoked potential (SSVEP) flickering BCI system. By estimating cross-frequency coupling (CFC), namely δ-θ [δ: 0.5-4 Hz, θ: 4-8 Hz] phase-to-amplitude coupling (PAC), within sensors and across experimental time, we achieved high classification accuracy and information transfer rates (ITR) in all three data sets, outperforming the originally reported ITRs. The bit rates obtained for both the disabled and able-bodied subjects reached the fastest reported level of 324 bits/min with the PAC estimator. Our approach also outperformed alternative signal features such as relative power (29.73 bits/min) and raw time-series analysis (24.93 bits/min), as well as the originally reported bit rates of 10-25 bits/min. In the second data set, we achieved an average ITR of 124.40 ± 11.68 bits/min for the slow (60 Hz) and 233.99 ± 15.75 bits/min for the fast (120 Hz) stimulus presentation. In the third data set, we achieved an average ITR of 106.44 ± 8.94 bits/min. The current methodology outperforms all previously reported methodologies applied to each of these three freely available BCI data sets.
Keywords: SSVEP; accuracy; brain–computer interface; c-VEP; cross-frequency coupling; disabled subjects; performance; phase-to-amplitude coupling
Year: 2018 PMID: 29867425 PMCID: PMC5952007 DOI: 10.3389/fninf.2018.00019
Source DB: PubMed Journal: Front Neuroinform ISSN: 1662-5196 Impact factor: 4.081
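The bits/min figures reported throughout are conventionally computed with Wolpaw's ITR definition, which combines the number of classes N, the classification accuracy P, and the time per selection. Below is a minimal sketch of that standard formula; whether the authors used exactly this variant is an assumption, and `trial_sec` is a hypothetical parameter used only for the usage example.

```python
# Wolpaw (2002) information transfer rate, the usual basis for BCI bits/min.
# Assumption: the paper's ITRs follow this standard definition.
import math

def itr_bits_per_min(p: float, n_classes: int, trial_sec: float) -> float:
    """Bits per selection, scaled to bits per minute."""
    if p >= 1.0:                      # perfect accuracy: log2(N) bits/selection
        bits = math.log2(n_classes)
    elif p <= 1.0 / n_classes:        # at or below chance: clamp to zero
        bits = 0.0
    else:
        bits = (math.log2(n_classes)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n_classes - 1)))
    return bits * (60.0 / trial_sec)  # selections per minute = 60 / trial_sec

# e.g., itr_bits_per_min(0.95, 6, 2.0) -> ~65 bits/min
```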
Figure 1. The six flashing images.
Subjects from whom data were recorded in the environment control system study (Hoffmann et al., 2008).

| | Subject 1 | Subject 2 | Subject 3 | Subject 4 |
| --- | --- | --- | --- | --- |
| Diagnosis | Cerebral palsy | Multiple sclerosis | Late-stage amyotrophic lateral sclerosis | Traumatic brain and spinal-cord injury, C4 level |
| Age | 56 | 51 | 47 | 33 |
| Age at illness onset | 0 (perinatal) | 37 | 39 | 27 |
| Sex | M | M | M | F |
| Speech production | Mild dysarthria | Mild dysarthria | Severe dysarthria | Mild dysarthria |
| Limb muscle control | Weak | Weak | Very weak | Weak |
| Respiration control | Normal | Normal | Weak | Normal |
| Voluntary eye movement | Normal | Mild nystagmus | Normal | Normal |
Figure 2. The algorithmic steps for PAC estimation. Using the first single-trial signal from session 1 and flashing image 1 (A), from the P300 data of an able-bodied subject (subject 6), we demonstrate the detection of coupling between the θ and β1 rhythms. To estimate θ-β1 PAC, the raw signal was band-pass filtered into (B) a low-frequency θ (4–8 Hz) component, from which the instantaneous phase is extracted, and (C) a high-frequency β1 (13–20 Hz) component, from which the amplitude envelope is extracted. (D) We then filtered this β1 amplitude time series within the θ band (4–8 Hz), giving us the θ modulation of the lower-β amplitude. (E) We then extracted the instantaneous phase of both the θ-filtered signal and the θ-filtered lower-β amplitude and computed the phase locking between these two signals. (F) The latency-dependent phase differences are used to estimate the phase locking (iPLV) that reflects the PAC interaction between the two brain rhythms. This phase locking represents the degree to which the lower-β (β1) amplitude is co-modulated with the θ phase.
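The caption above describes the full pipeline: θ phase, β1 envelope, θ-band filtering of that envelope, and phase locking between the two phase series. Below is a minimal Python sketch of these steps, assuming numpy/scipy and a zero-phase Butterworth filter; the authors' exact filter design and estimator implementations are not given in this excerpt. The three returned values correspond to the iPLV, PLV, and MVL estimators compared in the tables that follow.

```python
# Sketch of the Figure 2 PAC pipeline (theta-beta1), under assumed filter settings.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter (order is an assumption)."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_estimators(x, fs, low=(4.0, 8.0), high=(13.0, 20.0)):
    """Return (iPLV, PLV, MVL) PAC estimates for one single-trial signal."""
    low_sig = bandpass(x, *low, fs)             # theta component (Fig. 2B)
    high_sig = bandpass(x, *high, fs)           # beta1 component (Fig. 2C)
    phase_low = np.angle(hilbert(low_sig))      # instantaneous theta phase
    amp_high = np.abs(hilbert(high_sig))        # beta1 amplitude envelope
    amp_mod = bandpass(amp_high, *low, fs)      # theta modulation of envelope (Fig. 2D)
    phase_amp = np.angle(hilbert(amp_mod))      # phase of the modulated envelope
    z = np.exp(1j * (phase_low - phase_amp))    # phase differences (Fig. 2E,F)
    plv = np.abs(z.mean())                      # phase-locking value
    iplv = np.abs(np.imag(z.mean()))            # imaginary part of the PLV
    # Canonical mean-vector-length estimator (Canolty-style); pairing the raw
    # envelope with the theta phase here is an assumption of this sketch.
    mvl = np.abs((amp_high * np.exp(1j * phase_low)).mean())
    return iplv, plv, mvl
```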
Figure 3. Subject 6 (able-bodied): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (iPLV) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
PAC-CFC: single-subject classification accuracies and the related bit rates for the disabled (Subjects 1–4) and able-bodied (Subjects 5–8) subjects, based on the Pz sensor and the three alternative CFC-PAC estimators.

| Subject | Accuracy (%): iPLV/PLV/MVL | Bit rate (bits/min): iPLV/PLV/MVL |
| --- | --- | --- |
| Subject 1 | 99.91/83.03/82.44 | 385.87/230.08/226.03 |
| Subject 2 | 99.92/82.35/82.55 | 386.05/225.41/226.78 |
| Subject 3 | 99.96/82.35/82.45 | 386.84/225.41/226.09 |
| Subject 4 | 99.95/82.11/82.41 | 386.63/223.78/225.82 |
| Subject 5 | 99.97/82.63/82.42 | 387.04/227.33/225.89 |
| Subject 6 | 99.99/82.39/83.88 | 387.48/225.69/236.02 |
| Subject 7 | 99.99/83.30/82.90 | 387.48/231.96/229.18 |
| Subject 8 | 99.99/83.45/82.85 | 387.48/233.00/228.84 |
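The excerpt does not state which classifier produced these accuracies, so the following is a purely hypothetical sketch: trial-wise PAC features standardized per feature and fed to a cross-validated linear SVM via scikit-learn.

```python
# Hypothetical classification step; the paper's actual classifier is not
# stated in this excerpt, so a linear SVM stands in purely for illustration.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def classify_pac_features(features: np.ndarray, labels: np.ndarray, folds: int = 5):
    """features: (n_trials, n_features) PAC values; labels: target image per trial."""
    clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
    scores = cross_val_score(clf, features, labels, cv=folds)
    return scores.mean(), scores.std()
```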
Figure 4. Subject 6 (able-bodied): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (PLV) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
Figure 5. Subject 6 (able-bodied): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (MVL) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
Figure 6. Subject 1 (disabled): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (iPLV) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
Figure 7. Subject 1 (disabled): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (PLV) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
Figure 8. Subject 1 (disabled): the level of CFC in c-VEP responses for each flashing image. Trial-averaged PAC (MVL) patterns from the c-VEP responses for each target image, for both attended and non-attended images.
Group-averaged α1 relative signal power for attended and non-attended images.

| Group | Attended | Non-attended |
| --- | --- | --- |
| Able-bodied | 0.09 ± 0.02 | 0.06 ± 0.01 |
| Disabled | 0.10 ± 0.02 | 0.07 ± 0.01 |
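Relative power, the comparison feature used here, is the power in a band divided by the broadband power. Below is a minimal sketch using a Welch PSD; the α1 band edges (7.5–10.5 Hz) and the broadband range are assumptions, since the excerpt does not define them.

```python
# Relative alpha1 power from a Welch PSD; band edges are assumed values.
import numpy as np
from scipy.signal import welch

def relative_power(x, fs, band=(7.5, 10.5), broadband=(0.5, 45.0)):
    """Band power divided by broadband power for one signal."""
    f, pxx = welch(x, fs=fs, nperseg=min(len(x), 2 * int(fs)))
    in_band = (f >= band[0]) & (f <= band[1])
    in_broad = (f >= broadband[0]) & (f <= broadband[1])
    return np.trapz(pxx[in_band], f[in_band]) / np.trapz(pxx[in_broad], f[in_broad])
```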
PAC-CFC: single-subject classification accuracies and the related bit rates for the disabled (Subjects 1–4) and able-bodied (Subjects 5–8) subjects, based on the Pz sensor and the three alternative CFC-PAC estimators.

| Subject | Accuracy (%): iPLV/PLV/MVL | Bit rate (bits/min): iPLV/PLV/MVL |
| --- | --- | --- |
| Subject 1 | 94.63/83.03/75.45 | 323.75/230.08/181.62 |
| Subject 2 | 95.35/82.35/75.35 | 330.84/225.41/181.03 |
| Subject 3 | 95.15/82.35/75.15 | 328.85/225.41/179.86 |
| Subject 4 | 96.62/82.11/75.61 | 344.00/223.78/182.57 |
| Subject 5 | 94.12/82.63/75.32 | 318.86/227.33/180.86 |
| Subject 6 | 92.51/82.39/76.09 | 304.06/225.69/185.43 |
| Subject 7 | 93.20/83.30/75.80 | 310.29/231.96/183.70 |
| Subject 8 | 95.45/83.45/75.44 | 331.85/233.00/181.57 |
α1 relative power: single-subject classification accuracies and the related bit rates for the disabled (Subjects 1–4) and able-bodied (Subjects 5–8) subjects, based on the Pz sensor.

| Subject | Accuracy (%) | Bit rate (bits/min) |
| --- | --- | --- |
| Subject 1 | 33.45 | 18.03 |
| Subject 2 | 38.45 | 29.19 |
| Subject 3 | 36.47 | 24.49 |
| Subject 4 | 35.58 | 22.50 |
| Subject 5 | 40.12 | 33.44 |
| Subject 6 | 41.23 | 36.40 |
| Subject 7 | 42.02 | 38.57 |
| Subject 8 | 40.78 | 35.18 |
α1 raw time series: single-subject classification accuracies and the related bit rates for the disabled (Subjects 1–4) and able-bodied (Subjects 5–8) subjects, based on the Pz sensor.

| Subject | Accuracy (%) | Bit rate (bits/min) |
| --- | --- | --- |
| Subject 1 | 32.12 | 15.47 |
| Subject 2 | 33.37 | 17.88 |
| Subject 3 | 35.51 | 22.35 |
| Subject 4 | 34.69 | 20.59 |
| Subject 5 | 38.34 | 28.93 |
| Subject 6 | 39.21 | 31.10 |
| Subject 7 | 39.87 | 32.79 |
| Subject 8 | 38.91 | 30.34 |
PAC-CFC: single-subject classification accuracies and the related bit rates for the 17 subjects of the c-VEP data set, for both slow (60 Hz) and fast (120 Hz) stimulus presentation.

| Subject | Accuracy (%): slow/fast | Bit rate (bits/min): slow/fast |
| --- | --- | --- |
| Subject 1 | 99.15 ± 1.61/96.31 ± 1.56 | 136.38 ± 4.31/247.13 ± 6.45 |
| Subject 2 | 95.67 ± 1.87/95.78 ± 1.43 | 128.15 ± 4.53/227.63 ± 5.61 |
| Subject 3 | 99.14 ± 1.11/94.67 ± 1.21 | 136.99 ± 3.99/253.50 ± 5.12 |
| Subject 4 | 98.75 ± 1.23/95.01 ± 1.31 | 133.85 ± 3.69/225.98 ± 5.43 |
| Subject 5 | 96.54 ± 1.87/95.45 ± 1.47 | 129.90 ± 3.44/235.97 ± 5.42 |
| Subject 6 | 95.12 ± 1.66/96.07 ± 1.66 | 120.45 ± 4.11/230.65 ± 5.37 |
| Subject 7 | 96.41 ± 1.37/96.12 ± 1.40 | 132.02 ± 3.91/242.76 ± 5.61 |
| Subject 8 | 94.37 ± 1.48/95.88 ± 1.33 | 112.55 ± 3.77/212.20 ± 6.01 |
| Subject 9 | 93.14 ± 1.39/95.76 ± 1.41 | 97.34 ± 3.81/227.67 ± 5.14 |
| Subject 10 | 94.51 ± 1.18/96.62 ± 1.37 | 108.27 ± 3.97/211.47 ± 5.62 |
| Subject 11 | 93.27 ± 1.31/96.34 ± 1.45 | 111.96 ± 4.01/213.90 ± 5.09 |
| Subject 12 | 92.18 ± 1.48/97.01 ± 1.43 | 122.22 ± 4.31/236.45 ± 6.11 |
| Subject 13 | 97.81 ± 1.34/96.38 ± 1.20 | 134.13 ± 3.92/242.03 ± 6.34 |
| Subject 14 | 96.54 ± 1.41/95.43 ± 1.57 | 114.81 ± 3.87/222.05 ± 6.71 |
| Subject 15 | 97.78 ± 1.45/95.17 ± 1.67 | 128.58 ± 3.67/225.47 ± 6.89 |
| Subject 16 | 94.19 ± 1.32/96.07 ± 1.46 | 131.11 ± 3.77/260.92 ± 5.91 |
| Subject 17 | 96.67 ± 1.24/96.51 ± 1.57 | 136.01 ± 4.11/262.00 ± 6.72 |
Figure 9. Scalp plot illustrating how many times each channel contributed to the best performance across subjects: (A) slow stimulus presentation (60 Hz); (B) fast stimulus presentation (120 Hz).
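The per-channel counts shown in Figures 9 and 10 amount to a simple tally of how often each channel yields a subject's best accuracy. A minimal sketch follows; the `acc` matrix layout is hypothetical, as the excerpt does not describe the underlying data structure.

```python
# Tally of best-performing channels across subjects (Figures 9-10 summary).
import numpy as np

def best_channel_counts(acc: np.ndarray) -> np.ndarray:
    """acc: (n_subjects, n_channels) accuracies -> per-channel win counts."""
    best = acc.argmax(axis=1)                     # best channel per subject
    return np.bincount(best, minlength=acc.shape[1])
```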
PAC-CFC: single-subject classification accuracies and the related ITRs for the 11 subjects of the SSVEP multi-target data set.

| Subject | Accuracy (%) | ITR (bits/min) |
| --- | --- | --- |
| Subject 1 | 99.15 ± 1.45 | 128.86 ± 5.61 |
| Subject 2 | 95.67 ± 1.31 | 103.10 ± 3.43 |
| Subject 3 | 99.14 ± 1.12 | 101.41 ± 3.12 |
| Subject 4 | 98.75 ± 1.44 | 109.39 ± 3.97 |
| Subject 5 | 96.54 ± 1.61 | 102.56 ± 4.01 |
| Subject 6 | 95.12 ± 1.56 | 97.38 ± 3.46 |
| Subject 7 | 96.41 ± 1.47 | 103.47 ± 3.91 |
| Subject 8 | 94.37 ± 1.78 | 113.79 ± 4.82 |
| Subject 9 | 93.14 ± 1.85 | 98.67 ± 3.79 |
| Subject 10 | 94.51 ± 1.45 | 110.04 ± 4.11 |
| Subject 11 | 93.27 ± 1.32 | 102.13 ± 4.78 |
Figure 10. Scalp plot illustrating how many times each channel contributed to the best performance across subjects.