| Literature DB >> 28790910 |
Keum-Shik Hong, Muhammad Jawad Khan.
Abstract
In this article, non-invasive hybrid brain-computer interface (hBCI) technologies for improving classification accuracy and increasing the number of commands are reviewed. Hybridization combining more than two modalities is a new trend in brain imaging and prosthesis control. Electroencephalography (EEG), due to its easy use and fast temporal resolution, is most widely utilized in combination with other brain/non-brain signal acquisition modalities, for instance, functional near infrared spectroscopy (fNIRS), electromyography (EMG), electrooculography (EOG), and eye tracker. Three main purposes of hybridization are to increase the number of control commands, improve classification accuracy and reduce the signal detection time. Currently, such combinations of EEG + fNIRS and EEG + EOG are most commonly employed. Four principal components (i.e., hardware, paradigm, classifiers, and features) relevant to accuracy improvement are discussed. In the case of brain signals, motor imagination/movement tasks are combined with cognitive tasks to increase active brain-computer interface (BCI) accuracy. Active and reactive tasks sometimes are combined: motor imagination with steady-state evoked visual potentials (SSVEP) and motor imagination with P300. In the case of reactive tasks, SSVEP is most widely combined with P300 to increase the number of commands. Passive BCIs, however, are rare. After discussing the hardware and strategies involved in the development of hBCI, the second part examines the approaches used to increase the number of control commands and to enhance classification accuracy. The future prospects and the extension of hBCI in real-time applications for daily life scenarios are provided.Entities:
Keywords: classification accuracy; electroencephalography; electromyography; electrooculography; functional near infrared spectroscopy; hybrid brain–computer interface
Year: 2017 PMID: 28790910 PMCID: PMC5522881 DOI: 10.3389/fnbot.2017.00035
Source DB: PubMed Journal: Front Neurorobot ISSN: 1662-5218 Impact factor: 2.650
Figure 1. Breakdown of the paper.
Figure 2. Purposes of hybrid brain–computer interface: (i) increase the number of control commands by combining electroencephalography (EEG) with functional near infrared spectroscopy (fNIRS) [and further with electrooculography (EOG)] and (ii) improve the classification accuracy by removing motion artifacts.
Combinations of devices.
| Modality combination | Sensor placement | Signal combination | Possible outcome |
|---|---|---|---|
| Electroencephalography (EEG) + electrooculography (EOG) | Brain and eyes | Electrophysiological + eye movement | Increase in control commands/increase in accuracy |
| EEG + electromyography (EMG) | Brain and muscles | Electrophysiological + electromyography | Increase in accuracy |
| EEG + functional near infrared spectroscopy (fNIRS) | Brain | Electrophysiological + hemodynamic | Increase in classification accuracy/increase in control commands |
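The EEG + fNIRS row above fuses electrophysiological and hemodynamic features to raise classification accuracy. A minimal sketch of feature-level fusion with an LDA classifier on simulated data (all feature values, dimensions, and the class-dependent shift are placeholders, not taken from any study cited in this record):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Simulated per-trial features (placeholders, not real recordings):
n_trials = 200
eeg_feat = rng.normal(size=(n_trials, 8))    # e.g., mu/beta band power per channel
fnirs_feat = rng.normal(size=(n_trials, 4))  # e.g., mean/slope of HbO per channel
labels = rng.integers(0, 2, size=n_trials)   # two-class task (e.g., left vs. right MI)

# Inject a weak class-dependent shift so the classifier has something to learn
eeg_feat[labels == 1, 0] += 1.0
fnirs_feat[labels == 1, 0] += 1.0

# Feature-level fusion: concatenate EEG and fNIRS features, then classify with LDA
hybrid = np.hstack([eeg_feat, fnirs_feat])
clf = LinearDiscriminantAnalysis().fit(hybrid[:150], labels[:150])
acc = clf.score(hybrid[150:], labels[150:])
print(f"hybrid LDA accuracy: {acc:.2f}")
```

Concatenation followed by LDA is a common baseline for hybrid modalities because it needs no alignment between the two feature spaces; decision-level fusion (combining per-modality classifier outputs) is the usual alternative.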
Figure 3. Electroencephalography (EEG)–electrooculography (EOG)-based brain–computer interface: the blink signals are used for switching between EEG- and EOG-based command generation, in which EEG and EOG generate P300-based commands and frown–wink–gaze-based commands, respectively.
Figure 4. Electroencephalography (EEG)–electromyography (EMG)-based brain–computer interface: one choice is selected using steady-state visual evoked potential (SSVEP), and muscle movement is used to change the selected option.
Figure 5. Electroencephalography (EEG)–fNIRS-based brain–computer interface: the figure shows a method for removing false-positive motor imagery signals in EEG data using functional near infrared spectroscopy (fNIRS) (delayed decision).
Combinations of brain signals.
| Task 1 | Task 2 | Sensor placement | Modalities | Activity type |
|---|---|---|---|---|
| Steady-state visual evoked potential (SSVEP) | P300 | Occipital and parietal | Electroencephalography (EEG) | Reactive |
| SSVEP | Motor signals | Occipital and motor | EEG | Combination of reactive and active |
| Motor signals | P300 | Parietal and motor | EEG | Combination of active and reactive |
| P300 | Eye movement | Parietal, motor and eyes | EEG + electrooculography (EOG) | Reactive |
| Prefrontal signals | Motor signals | Prefrontal and motor | EEG + functional near infrared spectroscopy (fNIRS) (fNIRS alone in some cases) | Active (for both fNIRS alone and fNIRS combined with EEG) |
Important active hybrid brain–computer interface (BCI) studies on increasing accuracy and the number of commands (from 2010 to 2016).
| Reference | Brain area | Activity | Modality | Application | Analysis type | Classifier | Commands | Accuracy | Window size |
|---|---|---|---|---|---|---|---|---|---|
| Li et al. | Whole brain | Motor imagery (MI) and P300 | Electroencephalography (EEG) + electrooculography (EOG) | Cursor control in 2D | Online | Support vector machine (SVM) | 4 | 92.8% | 0–600 ms after button flashes on the screen for 8 s |
| Allison et al. | Motor and occipital regions | MI and steady-state visual evoked potential (SSVEP) | EEG | Option selection from the screen | Offline | Linear discriminant analysis (LDA) | 4 | 74.8% for MI, 76.9% for SSVEP, and 81% for hybrid | 3–5 s window |
| Zhang et al. | Motor, parietal, and occipital regions | Mental task | EEG + EOG + electromyography (EMG) | Application to device control | Offline | Fisher discriminant analysis combined with Mahalanobis distance | 4 | 75.3% average for two-class and 54.1% for four-class | 0–1 s |
| Su et al. | Whole brain | MI and P300 | EEG | Virtual environment control | Online | SVM and Fisher LDA | 5 | 84.5% for MI and 81.7% for P300 | 0–2 s for MI and 0.7 s for P300 |
| Leeb et al. | Motor cortex | Motor execution | EEG + EMG | Application to patient motor training | Online | Bayesian | 2 | 87% for individual and 91% for hybrid case | 0.5 s for EEG and 0.3 s for EMG |
| Long et al. | Frontal, central, parietal, and occipital regions | P300 and MI | EEG | Direction and speed control for wheelchair | Online | LDA | 5 | 75.4% for hybrid task | 1 s |
| Yong et al. | Motor cortex | Hand and eye movement | EEG + EOG (eye tracker) | Artifact removal for choice selection | Online | Stepwise LDA (SW-LDA) | 2 | True positive rate increases from 44.7 to 73.1% (in 1 s) | 1 s |
| Fazli et al. | Frontal, motor, and parietal cortex | MI and motor execution | EEG + functional near infrared spectroscopy (fNIRS) | Application to control | Offline | LDA | 2 | 93.2% (motor execution) and 83.2% (MI) | 0.75 s for EEG; 6 s prior to stimulus onset up to 15 s after onset using a 1 s sliding window for fNIRS |
| Choi and Jo | Whole brain | SSVEP, MI, and P300 | EEG | Humanoid robot navigation and recognition | Real time | Canonical correlation analysis (CCA) | 6 | 84.6% for P300 and 84.04% for SSVEP | 2 s |
| Cao et al. | Frontal, central, parietal, and occipital cortex | SSVEP and MI | EEG | Brain-actuated switch for wheelchair control | Online | SVM | 8 | 90.6% | – |
| Wang et al. | Whole brain | MI, P300, and eye blinking | EEG + EOG | Asynchronous wheelchair control | Online | SVM | 7 | 91, 93, 89, and 92% for forward, backward, stop with special threshold, and stop with optimal threshold, respectively | 4 s |
| Khan et al. | Prefrontal and motor cortex | Mental arithmetic, mental counting, and motor execution | EEG + fNIRS | Application to wheelchair control | Online | LDA | 4 | 94.7% for left and right movement commands (EEG); 80.2 and 83.6% for forward and backward using fNIRS | 0–10 s for fNIRS and 0–1 s for EEG |
| Kim et al. | Whole brain | Eye movement | EEG + eye tracker | Quadcopter control | Real time | SVM | 8 | 91.67% | 5 s |
| Jiang et al. | Motor cortex | MI and eye movement | EEG + EOG | Application to BCI control | Online | LDA | 4 | 90.4% for MI, 91.1% for relax, 96.4% for gaze left, and 97.3% for gaze right | 3 s |
| Kaiser et al. | Motor cortex | MI | EEG + fNIRS | Application to brain monitoring | Online | LDA | 1 | 3.6% increase in accuracy with hybrid modality | 3–7 s |
| Lorenz et al. | Whole brain | ERP and MI | EEG | BCI-driven neuro-prosthesis | Online | LDA | 6 | Maximum selection accuracy of 98.46% and maximum confirmation accuracy of 96.26% | 1 s |
| Blokland et al. | Motor cortex | MI and motor execution | EEG + fNIRS | Application to tetraplegia patients | Offline | – | 2 | 87% for motor attempt and 79% for MI in tetraplegia patients | 3–15 s for fNIRS and 0–15 s for EEG |
| Bai et al. | Whole brain | MI and P300 | EEG | Opening, closing, and selection of files in explorer | Online | SVM | 9 (can reach up to 50) | >90% | 4 s window for MI and 600 ms for P300 |
| Hortal et al. | Motor and parietal cortex | Mental imagination | EEG + EOG | Robotic arm control for pick-and-place task | Real time | SVM | 6 | Task 1: 71.13% and Task 2: 61.51% | 0.5 s to synchronize output to BMI |
| Hong et al. | Prefrontal and motor cortex | Mental arithmetic and MI | fNIRS | Application to three-choice selection | Offline | LDA | 3 | 75.6% | 2–7 s |
| Naseer and Hong | Prefrontal and motor cortex | Mental arithmetic, mental counting, and MI | fNIRS | Decoding answers to four-choice questions | Offline | LDA | 4 | RMI, LMI, MA, and MC correctly classified at 72.9, 64.2, 65.1, and 71.0%, respectively | 2–7 s |
| Yin et al. | Motor cortex | MI task | EEG + fNIRS | Increase in accuracy for BCI | Online | Extreme learning machine (ELM) | 2 | 88% | 0.5 s for EEG and 0–12 s for fNIRS |
| Koo et al. | Motor cortex | Self-paced MI | EEG + fNIRS | Application to device control | Online | SVM | 2 | 88% average accuracy | 10 s for fNIRS and three 5 s windows with 2.5 s step size for EEG |
| Buccino et al. | Motor cortex | Arm and hand movement | EEG + fNIRS | Hand movement discrimination | Online | LDA | 2 simultaneous commands | 94.2% (rest-task classification) | 0–6 s (hybrid) |
| Shishkin et al. | Whole brain | Eye gaze | EEG + EOG | Game control | Offline | LDA | – | 90% | 0.3 s for EEG and 0.2–0.5 s for EOG |
| Khan and Hong | Frontal | Mental task and eye movement | fNIRS + EEG | Application to quadcopter control | Online | LDA | 8 | 76.5% for fNIRS and 86% for EEG | 1 s for EEG and 2 s for fNIRS |
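Many of the window sizes in the table above refer to fixed epochs cut after each stimulus onset (e.g., 0–600 ms after a flash for P300). A minimal sketch of such epoch extraction; the sampling rate, stimulus times, and data are hypothetical placeholders:

```python
import numpy as np

fs = 250  # sampling rate in Hz (assumed)
rng = np.random.default_rng(1)

# Simulated continuous EEG: 60 s, 8 channels (placeholder data)
eeg = rng.normal(size=(60 * fs, 8))
stim_onsets_s = np.arange(1.0, 58.0, 1.5)  # hypothetical stimulus onsets (s)

def extract_epochs(data, onsets_s, fs, tmin=0.0, tmax=0.6):
    """Cut fixed windows (tmin..tmax s relative to each onset)."""
    n = round((tmax - tmin) * fs)          # samples per epoch
    epochs = []
    for t in onsets_s:
        start = round((t + tmin) * fs)     # onset in samples
        epochs.append(data[start:start + n])
    return np.stack(epochs)                # (n_stimuli, n_samples, n_channels)

epochs = extract_epochs(eeg, stim_onsets_s, fs)
print(epochs.shape)  # (38, 150, 8)
```

Each epoch then feeds the per-trial feature extraction and classification steps (LDA, SVM, etc.) listed in the table.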
Important passive hybrid brain–computer interface studies for drowsiness detection (from 2010 to 2016).
| Reference | Brain area | Modality | Application | Analysis type | Classifier | Commands | Accuracy (%) | Window size (s) |
|---|---|---|---|---|---|---|---|---|
| Khushaba et al. | Frontal and occipital | Electroencephalography (EEG) + electrooculography (EOG) + electrocardiography (ECG) | Driver drowsiness detection | Online | Linear discriminant analysis (LDA), support vector machine (SVM), K-nearest neighbor, and kernel SVM | 1 | 95–97 | 10 |
| Chen et al. | Frontal and occipital | EEG + EOG | Automatic detection of drowsiness | Online | Extreme learning machine (ELM) | 2 (single command for drowsiness) | 97.3 | 8 |
| Ahn et al. | Whole brain | EEG + fNIRS | Mental fatigue level estimation | Online | LDA | 1 | 75.9 | 60 |
Figure 6. Trend in electroencephalography (EEG)/functional near infrared spectroscopy (fNIRS)-based hybrid brain–computer interface (BCI).
Figure 7. Hybrid brain–computer interface using electroencephalography (EEG) in combination with other modalities (2009–2016).
Figure 8. Hybrid brain–computer interface paradigms combining different brain signals (2009–2016).
Figure 9. The proposed hybrid electroencephalography (EEG)–fNIRS scheme using hemodynamic and initial dip features for simultaneous activity detection and classification.
Important reactive hybrid brain–computer interface studies (from 2010 to 2016).
| Reference | Brain area | Activity | Modality | Application | Analysis type | Classifier | Commands | Accuracy | Window size |
|---|---|---|---|---|---|---|---|---|---|
| Yin et al. | Parietal and occipital cortex | P300 and steady-state visual evoked potential (SSVEP) | Electroencephalography (EEG) | Speller | Online | Stepwise linear discriminant analysis (SW-LDA) | Up to 36 | 93.85% using hybrid paradigm | All rows and columns flashed in 2.88 s |
| Zimmermann et al. | Motor cortex | Isometric finger-pinching task | Functional near infrared spectroscopy (fNIRS) + bio-signals (ECG) | Feasibility for BCI | Offline | Hidden Markov model (HMM) | 1 | 88.5% | 5–20 s |
| Li et al. | Whole brain | SSVEP and P300 | EEG | Wheelchair control | Online | Support vector machine (SVM) | 6 | >80% | 0–0.6 s after a button flash completes for P300 and 3.2 s for SSVEP |
| Xu et al. | Whole brain | SSVEP and P300 | EEG | BCI speller for target selection | Online | SW-LDA | 9 | 93.3% for P300 + SSVEP-B | 0–0.8 s after onset |
| Bi et al. | Parietal and occipital cortex | P300 and SSVEP | EEG | Speed and direction for cursor control | Online | SVM | 4 | >90% | 4 s |
| Aziz et al. | Frontal and occipital | Eye movements | EEG + electrooculography (EOG) | Automated wheelchair navigation | Online | SVM, HMM | 5 | 98% | 0.5 s |
| Li et al. | Motor and occipital | Motor imagery and SSVEP | EEG | Wheelchair control | Real time | SVM | 6 | – | – |
| Witkowski et al. | Motor cortex | Hand-grasping motion assisted with exoskeleton | EEG + EOG | Assistive rehabilitation applications | Online | Sensitivity index | 4 | Average accuracy 62.28% for two conditions | 5 s |
| Putze et al. | Auditory and visual cortex | Visual and auditory stimuli | EEG + fNIRS | Application to patient choice selection | Online | Linear discriminant analysis (LDA), SVM | 2 | 94.7% average | Window sizes of 1, 2, 4, 8, and 16 s |
| Tomita et al. | Visual cortex | SSVEP-based task | EEG + fNIRS | Optimal window selection for hybrid EEG–fNIRS | Offline | – | 1 | 85% average accuracy (in 10 s optimal window) | 0–10 s |
| Fan et al. | Parietal and occipital | SSVEP and P300 | EEG | Vehicle destination selection system | Online | LDA | 11 | 99% | 0–0.51 s from onset for P300 and 8 s for SSVEP |
| Ma et al. | Parietal and occipital | P300 and eye blink | EEG + EOG | Mobile robot control | Real time | LDA | 9 | 87.3% for average of five trials | ~1.6 s |
| Combaz and Van Hulle | Whole brain | P300 and SSVEP | EEG | Application to option selection for locked-in patients | Online | SVM | 12 | Maximum achieved >95% | 200 ms before stimulation to 800 ms after stimulation for experiment 1 |
| Wang et al. | Whole brain | P300 and SSVEP (shape-changing and flickering hybrid) | EEG | Development of new paradigm with application to device control | Online | Canonical correlation analysis (CCA), Bayesian LDA | 4 | Overall 20% increase in SSVEP classification, 100% for P300 | Flash start to flash end for SSVEP; single flashes lasting 0.8 s for P300 |
| Ramli et al. | Motor and occipital | Eye gaze | EEG + EOG | Application to wheelchair control | Online | Finite-state machine (FSM) | 6 | 97.88% | 0.5 s |
| Yin et al. | Parietal and occipital cortex | P300 and SSVEP | EEG | Speller paradigm with applications to BCI system control | Online | SW-LDA for P300, CCA for SSVEP | Up to 64 | 95.18% | 0.8 s epochs after stimulation |
| Kim et al. | Occipital | SSVEP and eye movement | EEG + EOG | Turtle movement control | Online | CCA | 4 | 83% for event-related desynchronization (ERD) and 92.7% for SSVEP | 2 s |
| Lin et al. | Occipital | SSVEP | EEG + EMG | Choice selection | Online | CCA | 2 | 81% | 0.5–5 s |
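Several reactive studies above detect SSVEP with canonical correlation analysis (CCA), correlating multichannel EEG against sine/cosine references at each candidate flicker frequency and picking the frequency with the largest canonical correlation. A minimal sketch on simulated data (sampling rate, channel count, amplitudes, and candidate frequencies are assumptions, not values from any study in the table):

```python
import numpy as np

fs = 250          # sampling rate in Hz (assumed)
duration = 2.0    # 2 s analysis window, as in several studies above
t = np.arange(int(fs * duration)) / fs
rng = np.random.default_rng(2)

# Simulated 8-channel EEG containing a 10 Hz SSVEP response plus noise
target_hz = 10.0
eeg = 0.5 * np.sin(2 * np.pi * target_hz * t)[:, None] + rng.normal(size=(len(t), 8))

def cca_corr(X, Y):
    """Largest canonical correlation between the column spaces of X and Y."""
    Qx, _ = np.linalg.qr(X - X.mean(0))
    Qy, _ = np.linalg.qr(Y - Y.mean(0))
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def reference_signals(f, t, harmonics=2):
    """Sine/cosine reference set at f and its harmonics."""
    return np.column_stack([fn(2 * np.pi * h * f * t)
                            for h in range(1, harmonics + 1)
                            for fn in (np.sin, np.cos)])

candidates = [8.0, 10.0, 12.0, 15.0]  # hypothetical flicker frequencies
scores = {f: cca_corr(eeg, reference_signals(f, t)) for f in candidates}
detected = max(scores, key=scores.get)
print(detected)  # 10.0
```

CCA needs no training data, which is one reason it appears repeatedly as the SSVEP classifier in the table; the trade-off is sensitivity to short windows and to non-flicker oscillatory activity.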