| Literature DB >> 26678249 |
Erwei Yin, Timothy Zeyl, Rami Saab, Dewen Hu, Zongtan Zhou, Tom Chau.
Abstract
Most P300 event-related potential (ERP)-based brain-computer interface (BCI) studies focus on gaze shift-dependent BCIs, which cannot be used by people who have lost voluntary eye movement. However, the performance of visual saccade-independent P300 BCIs is generally poor. To improve saccade-independent BCI performance, we propose a bimodal P300 BCI approach that simultaneously employs auditory and tactile stimuli. The proposed P300 BCI is a vision-independent system because no visual interaction is required of the user. Specifically, we designed a direction-congruent bimodal paradigm by randomly and simultaneously presenting auditory and tactile stimuli from the same direction. Furthermore, the channels and number of trials were tailored to each user to improve online performance. With 12 participants, the average online information transfer rate (ITR) of the bimodal approach improved by 45.43% and 51.05% over the auditory-only and tactile-only approaches, respectively. Importantly, the average online ITR of the bimodal approach, including the break time between selections, reached 10.77 bits/min. These findings suggest that the proposed bimodal system holds promise as a practical visual saccade-independent P300 BCI.
Keywords: Brain–computer interface; P300 event-related potentials; auditory; bimodal stimuli; tactile
Year: 2015 PMID: 26678249 DOI: 10.1142/S0129065716500015
Source DB: PubMed Journal: Int J Neural Syst ISSN: 0129-0657 Impact factor: 5.866
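The abstract's headline figure of 10.77 bits/min is an information transfer rate. The abstract does not state how ITR was computed; a minimal sketch, assuming the standard Wolpaw definition commonly used in P300 BCI work (bits per selection from the number of classes and the selection accuracy, scaled by selection rate):

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """ITR in bits/min via the Wolpaw formula (an assumption here;
    the paper's exact ITR definition is not given in this abstract).

    bits/selection = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    n, p = n_classes, accuracy
    if p >= 1.0:
        bits = math.log2(n)          # perfect accuracy: full log2(N) bits
    elif p <= 0.0:
        bits = 0.0
    else:
        bits = (math.log2(n)
                + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return max(bits, 0.0) * selections_per_min

# Illustrative values only (not from the paper): a 4-class speller
# at 100% accuracy and 10 selections/min yields 20 bits/min.
print(wolpaw_itr(4, 1.0, 10.0))
```

Note that an ITR quoted "including the break time between selections", as in this abstract, uses the full wall-clock time per selection, which yields a lower (more realistic) figure than counting stimulation time alone.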