Luca Ascari1,2, Anna Marchenkova3, Andrea Bellotti1, Stefano Lai1, Lucia Moro2, Konstantin Koshmak2, Alice Mantoan1, Michele Barsotti1, Raffaello Brondi2, Giovanni Avveduto1, Davide Sechi1, Alberto Compagno1, Pietro Avanzini3, Jonas Ambeck-Madsen4, Giovanni Vecchiato3.
Abstract
Nowadays, the growing interest in gathering physiological data and human behavior in everyday-life scenarios is paralleled by an increase in wireless devices recording brain and body signals. However, the technical issues that characterize these solutions often limit full brain-related assessments in real-life scenarios. Here we introduce the Biohub platform, a hardware/software (HW/SW) integrated wearable system for multistream synchronized acquisitions. The system consists of off-the-shelf hardware and state-of-the-art open-source software components, integrated into a high-tech, low-cost solution that is complete yet easy to use outside conventional labs. It flexibly cooperates with several devices, regardless of the manufacturer, and overcomes the possibly limited resources of the recording devices. The Biohub was validated by characterizing the quality of (i) multistream synchronization, (ii) in-lab electroencephalographic (EEG) recordings compared with a medical-grade high-density device, and (iii) a Brain-Computer Interface (BCI) in a real driving condition. Results show that the system can reliably acquire multiple data streams with high timing accuracy and record standard-quality EEG signals, making it a valid device for advanced ergonomics studies such as driving, telerehabilitation, and occupational safety.
Keywords: EEG; behavior; bio-potentials; ergonomics; wearable device
Year: 2021 PMID: 34960261 PMCID: PMC8707223 DOI: 10.3390/s21248167
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Multimodal GUI. A screenshot of the GUI while using the Biohub in a car experiment is shown. On the left, the user can use the menu to start the recording, load pre-recorded datasets, or control plot parameters and data sources. The central panel shows the synchronized video captured from the smartphone camera pointing at the road and the route reconstructed by the GPS module of the Android smartphone. On the right, two EEG signals and one control signal are plotted according to the selection carried out by the user.
Figure 2. Top: pipeline of signals within the Biohub architecture (left); EEG headset plugin (right). Bottom: mobile configuration of the Biohub (from left to right: power bank, USB modem, embedded system based on the Jetson TX2 and Orbitty carrier board in a metallic case).
Figure 3. Schematic representation of the Eye and Motor tasks, and the stimuli used in the SSVEP and Oddball tasks.
Figure 4. Visual representation of the BCI tasks. The upper picture shows a participant, seen from the back, wearing the EEG headset of the Biohub and performing the in-car session of the experiment. Panels (A,B) show the schema of the protocols used in the in-lab and in-car sessions.
Figure 5. Example of accelerometer LSL streams coming from different devices, before initial offset compensation.
Computed offsets (in ms) between streams, averaged across sessions; the first row contains the offsets relative to the master IMU (the Jetson TX2 in this case). The numbers in brackets indicate the offsets in samples, given the devices' different sampling rates.
| | Jetson | Android | gTec | openBCI | Metawear1 | Metawear2 | Metawear3 | Metawear4 |
|---|---|---|---|---|---|---|---|---|
| Jetson | 0 ± 0 | −2 ± 8 [1] | −22 ± 12 [11] | 40 ± 2 [20] | −35 ± 11 [1] | −31 ± 7 [1] | −30 ± 8 [1] | −34 ± 10 [1] |
| Android | | 0 ± 0 | −21 ± 5 | 33 ± 11 | −30 ± 22 | −29 ± 17 | −24 ± 20 | −32 ± 22 |
| gTec | | | 0 ± 0 | 63 ± 1 | −23 ± 31 | −14 ± 20 | −9 ± 19 | −16 ± 22 |
| openBCI | | | | 0 ± 0 | −68 ± 8 | −63 ± 9 | −64 ± 6 | −71 ± 13 |
| Metawear1 | | | | | 0 ± 0 | 3 ± 3 | 4 ± 4 | −1 ± 1 |
| Metawear2 | | | | | | 0 ± 0 | 0 ± 0 | −4 ± 2 |
| Metawear3 | | | | | | | 0 ± 0 | −4 ± 2 |
| Metawear4 | | | | | | | | 0 ± 0 |
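The paper does not give the exact algorithm behind these numbers, but a common way to estimate such pairwise offsets is to cross-correlate two accelerometer streams that witnessed the same motion event on a shared time base. The sketch below is an illustration only, under that assumption; the function name `estimate_offset_ms` and all parameters are hypothetical, not from the paper.

```python
import numpy as np

def estimate_offset_ms(sig_a, fs_a, sig_b, fs_b, fs_common=200.0):
    """Estimate how much stream B lags stream A, in milliseconds,
    by cross-correlating both signals on a common time base."""
    # Resample both streams to a shared rate (linear interpolation),
    # since the devices sample at different rates.
    dur = min(len(sig_a) / fs_a, len(sig_b) / fs_b)
    t = np.arange(0.0, dur, 1.0 / fs_common)
    a = np.interp(t, np.arange(len(sig_a)) / fs_a, sig_a)
    b = np.interp(t, np.arange(len(sig_b)) / fs_b, sig_b)
    # Remove the mean so the correlation peak tracks shared motion, not DC.
    a = a - a.mean()
    b = b - b.mean()
    # Lag of the correlation peak: positive when B's events occur after A's.
    xcorr = np.correlate(b, a, mode="full")
    lag = int(np.argmax(xcorr)) - (len(a) - 1)  # in samples at fs_common
    return 1000.0 * lag / fs_common

# Synthetic check: a shared motion burst, seen by stream B 40 ms later.
fs = 200.0
t = np.arange(0.0, 5.0, 1.0 / fs)
burst = np.exp(-((t - 2.0) ** 2) / 0.01)
rng = np.random.default_rng(0)
a = burst + 0.001 * rng.normal(size=t.size)   # master stream, slightly noisy
b = np.interp(t - 0.040, t, burst)            # 8 samples of delay at 200 Hz
offset_ms = estimate_offset_ms(a, fs, b, fs)
```

Averaging such per-event estimates across sessions, and dividing each offset by the slave device's sampling period, would yield ms and bracketed sample values of the kind tabulated above.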
Figure 6. EEG features and corresponding statistical power computed for the eye (panels (A,B)), motor (panels (C,D)), SSVEP (panels (E,F)), and oddball (panels (G,H)) tasks, comparing the performance of the Geodesic (black) and Biohub (green) systems. Shaded areas show the standard deviation around the mean.
Figure 7. Epoch rejection rates in the different sections of the proving ground. The sections are color-coded by the magnitude of the corresponding rejection rate. The arrow in the top-left corner indicates the driving direction, and the dashed markers highlight the lane-merging sections. The bottom aerial picture shows the real proportions of the proving ground.
Figure 8. Average subject-specific classification accuracy as a function of the EEG channel configuration and the number of repetitions of the menu icons.