| Literature DB >> 32399166 |
Lei Shao, Longyu Zhang, Abdelkader Nasreddine Belkacem, Yiming Zhang, Xiaoqi Chen, Ji Li, Hongli Liu.
Abstract
The assistive, adaptive, and rehabilitative applications of EEG-based robot control and navigation are undergoing a major transformation in both dimension and scope. With the rise of artificial intelligence, medical and nonmedical robots have developed rapidly and are gradually being applied to enhance people's quality of life. We focus on connecting the brain to a mobile home robot by translating brain signals into computer commands, building a brain-computer interface (BCI) that promises to greatly enhance the quality of life of disabled and able-bodied people alike by considerably improving their autonomy, mobility, and abilities. Several types of robots have been controlled through BCI systems to complete simple and/or complicated real-time tasks with high performance. In this paper, a new EEG-based intelligent teleoperation system was designed for a mobile wall-crawling cleaning robot. The robot uses a crawler mechanism instead of the traditional wheel type so that it can be used for window or floor cleaning. To control the robot's position as it climbs the wall and completes its cleaning tasks, we extracted the steady-state visually evoked potential (SSVEP) from the recorded electroencephalography (EEG) signal. The visual stimulation interface of the proposed SSVEP-based BCI consisted of four flickering targets with different frequencies (6 Hz, 7.5 Hz, 8.57 Hz, and 10 Hz). Seven subjects were able to smoothly control the movement direction of the cleaning robot by gazing at the corresponding flicker. To solve the resulting multiclass problem, and thereby clean the wall within a short period, the canonical correlation analysis (CCA) classification algorithm was used. Offline and online experiments were conducted to analyze/classify the EEG signals and translate them into real-time commands.
The proposed system was efficient in both the classification and control phases, achieving an accuracy of 89.92% with a fast, well-timed response at a bit rate of 22.23 bits/min. These results suggest that the proposed EEG-based cleaning-robot system is promising for smart home control, completing wall-cleaning tasks with efficiency, safety, and robustness.
Year: 2020 PMID: 32399166 PMCID: PMC7201509 DOI: 10.1155/2020/6968713
Source DB: PubMed Journal: J Healthc Eng ISSN: 2040-2295 Impact factor: 2.682
Figure 1. Experimental flowchart of the proposed SSVEP EEG-based BCI robot-control system.
Figure 2. EEG electrode placement used in the experiment with Brain Products equipment.
Figure 3. Flowchart of a subject performing a brain-control task.
Figure 4. The online experimental paradigm with respect to the path of a representative subject.
Figure 5. Signal analysis and processing steps of the proposed algorithm for decoding SSVEP from the EEG signal.
Figure 6. Feature extraction phase of CCA.
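The CCA decoding stage described in the abstract (and depicted in Figures 5 and 6) can be sketched as follows. This is a minimal illustration, not the authors' implementation: canonical correlations between a multichannel EEG window and sine/cosine reference templates at each stimulation frequency are computed, and the frequency with the largest correlation is selected. The function names, sampling rate, and harmonic count are assumptions.

```python
import numpy as np

def cca_max_corr(X, Y):
    """First canonical correlation between the column spaces of X and Y.

    Canonical correlations equal the singular values of Qx.T @ Qy,
    where Qx, Qy come from QR decompositions of the centered matrices.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Qx, _ = np.linalg.qr(X)
    Qy, _ = np.linalg.qr(Y)
    return np.linalg.svd(Qx.T @ Qy, compute_uv=False)[0]

def make_reference(freq, fs, n_samples, n_harmonics=2):
    """Sine/cosine templates at freq and its harmonics (n_harmonics assumed)."""
    t = np.arange(n_samples) / fs
    cols = []
    for h in range(1, n_harmonics + 1):
        cols.append(np.sin(2 * np.pi * h * freq * t))
        cols.append(np.cos(2 * np.pi * h * freq * t))
    return np.column_stack(cols)

def classify_ssvep(eeg, fs, freqs=(6.0, 7.5, 8.57, 10.0)):
    """eeg: (n_samples, n_channels) window. Returns (best frequency, correlations)."""
    corrs = [cca_max_corr(eeg, make_reference(f, fs, eeg.shape[0])) for f in freqs]
    return freqs[int(np.argmax(corrs))], corrs
```

The four default frequencies match the flicker targets reported in the abstract; the 250 Hz sampling rate used in any example call is an assumption, as the paper's acquisition settings are not given in this excerpt.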
Figure 7. Moving model of the crawler robot while turning.
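Figure 7's turning model for the tracked (crawler) robot is not reproduced in this excerpt, but a standard skid-steer kinematic sketch conveys the idea: the robot turns when its two tracks run at different speeds, with body velocity v = (v_r + v_l)/2 and yaw rate ω = (v_r − v_l)/W for track separation W. The function and parameter names below are illustrative assumptions, not the paper's model.

```python
def crawler_turn(v_left, v_right, track_width):
    """Skid-steer kinematics for a tracked robot.

    Returns (forward velocity, yaw rate, turning radius);
    the radius is float('inf') when the robot drives straight.
    """
    v = (v_left + v_right) / 2.0               # forward speed of the body center
    omega = (v_right - v_left) / track_width   # yaw rate in rad/s
    radius = float("inf") if omega == 0 else v / omega
    return v, omega, radius
```

Equal track speeds give straight-line motion; opposite speeds (v_left = −v_right) spin the robot in place about its center, which is how a crawler platform changes heading on a wall without translating.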
CCA correlation coefficients for the different SSVEP states.
| SSVEP state | Coefficient (mean ± SD) |
|---|---|
| 6 Hz | 0.40 ± 0.12 |
| 7.5 Hz | 0.44 ± 0.09 |
| 8.57 Hz | 0.50 ± 0.10 |
| 10 Hz | 0.51 ± 0.05 |
| Idle | 0.16 ± 0.04 |
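The gap between the idle-state coefficient (0.16 ± 0.04) and the four SSVEP states (all ≥ 0.40) suggests a simple rule for rejecting non-control periods: issue a command only when the largest CCA coefficient exceeds a cutoff lying between the two distributions. The threshold of 0.3 below is a hypothetical choice for illustration, not a value reported in the paper.

```python
def decide_command(coeffs, freqs=(6.0, 7.5, 8.57, 10.0), threshold=0.3):
    """Map per-frequency CCA coefficients to a target frequency, or None when idle.

    threshold=0.3 is a hypothetical cutoff sitting between the reported
    idle coefficients (~0.16) and the SSVEP coefficients (>= 0.40).
    """
    best = max(range(len(coeffs)), key=lambda i: coeffs[i])
    if coeffs[best] < threshold:
        return None          # treat as idle: no movement command issued
    return freqs[best]
```

Feeding the table's mean coefficients through this rule selects the 10 Hz target (coefficient 0.51), while a vector of idle-level coefficients produces no command.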
Experimental statistics: completed tasks and average completion time per subject.
| Subject | Completed tasks | Average time (min′ s″) |
|---|---|---|
| S1 | 6 | 6′25″ |
| S2 | 6 | 6′56″ |
| S3 | 6 | 6′17″ |
| S4 | 6 | 5′59″ |
| S5 | 4 | 6′00″ |
| S6 | 5 | 6′56″ |
| S7 | 4 | 7′00″ |
Per-subject classification accuracy, information transfer rate (ITR), and variance.
| Subject | Accuracy (%) | ITR (bits/min) | Variance |
|---|---|---|---|
| S1 | 91.11 | 22.52 | 9.88 |
| S2 | 89.45 | 22.26 | 12.65 |
| S3 | 91.11 | 22.52 | 6.16 |
| S4 | 91.11 | 22.52 | 9.88 |
| S5 | 91.67 | 22.61 | 13.89 |
| S6 | 88.89 | 22.17 | 13.57 |
| S7 | 86.11 | 21.04 | 5.25 |
| Average | 89.92 ± 3.81 | 22.23 ± 1.19 | 10.18 ± 4.93 |
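The reported ITR values can be checked against the standard Wolpaw bit-rate formula, ITR = (60/T)[log2 N + P log2 P + (1 − P) log2((1 − P)/(N − 1))], where N is the number of targets, P the accuracy, and T the time per selection in seconds. With N = 4 targets and the average accuracy P = 0.8992, each selection carries about 1.37 bits; a selection time near 3.7 s (an assumption, since T is not stated in this excerpt) reproduces the reported ~22.23 bits/min.

```python
import math

def itr_bits_per_min(n_targets, accuracy, seconds_per_selection):
    """Wolpaw information transfer rate in bits/min."""
    p, n = accuracy, n_targets
    bits = math.log2(n) + p * math.log2(p)   # bits per selection
    if p < 1.0:                               # the error term vanishes at p = 1
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / seconds_per_selection
```

Note that the formula counts only the four command targets; how (or whether) the idle state enters the authors' ITR computation is not specified in this excerpt.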
Figure 8. Experimental results of the proposed BCI system, evaluating classification accuracy, ITR, and variance across subjects. (a) Classification accuracy for each subject. (b) Information transfer rate for each subject. (c) Variance for each subject.