OBJECTIVE: We aimed to improve group performance in a challenging visual search task via a hybrid collaborative brain-computer interface (cBCI). METHODS: Ten participants individually undertook a visual search task in which a display was presented for 250 ms and they had to decide whether a target was present. Local temporal correlation common spatial pattern (LTCCSP) was used to extract neural features from response- and stimulus-locked EEG epochs. The resulting feature vectors were extended with response times and features extracted from eye movements. A classifier was trained to estimate the confidence of each group member. cBCI-assisted group decisions were then obtained using a confidence-weighted majority vote. RESULTS: Participants were combined in groups of different sizes to assess the performance of the cBCI. Results show that LTCCSP neural features, response times, and eye movement features significantly improve the accuracy of the cBCI over what we achieved with previous systems. For most group sizes, our hybrid cBCI yields group decisions that are significantly better than majority-based group decisions. CONCLUSION: The visual task considered here was much harder than the task used in our previous research. However, thanks to a range of technological enhancements, our cBCI delivered a significant improvement over group decisions made by a standard majority vote. SIGNIFICANCE: With previous cBCIs, groups may perform better than single non-BCI users. Here, cBCI-assisted groups are more accurate than identically sized non-BCI groups. This paves the way for a variety of real-world applications of cBCIs where reducing decision errors is vital.
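The confidence-weighted majority vote described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the encoding of decisions as +1/-1, and the tie-breaking rule are all assumptions.

```python
def weighted_majority(decisions, confidences):
    """Combine individual decisions (+1 = target present, -1 = target absent)
    into a group decision, weighting each member's vote by the
    classifier-estimated confidence (a non-negative number)."""
    # Sum the signed votes, each scaled by that member's confidence.
    score = sum(d * c for d, c in zip(decisions, confidences))
    # Break an exact tie in favour of "target present" (an assumption).
    return 1 if score >= 0 else -1

# A single confident member can outweigh two unconfident ones:
group_decision = weighted_majority([1, -1, -1], [0.9, 0.2, 0.3])
```

In the example above, the lone "present" vote carries weight 0.9 against a combined 0.5 for "absent", so the weighted vote overturns the plain majority. With all confidences equal, the rule reduces to a standard majority vote.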