| Literature DB >> 22346579 |
Rafael Barea, Luciano Boquete, Jose Manuel Rodriguez-Ascariz, Sergio Ortega, Elena López.
Abstract
This paper describes a sensory system for implementing a human-computer interface based on electrooculography. An acquisition system captures electrooculograms and transmits them via the ZigBee protocol. The data acquired are analysed in real time using a microcontroller-based platform running the Linux operating system. The continuous wavelet transform and neural network are used to process and analyse the signals to obtain highly reliable results in real time. To enhance system usability, the graphical interface is projected onto special eyewear, which is also used to position the signal-capturing electrodes.
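As an illustrative sketch only (not the paper's implementation), a single-scale continuous wavelet transform with a Ricker ("Mexican hat") kernel can flag the step-like edges that saccades produce in an EOG trace. The sampling rate, wavelet scale, and threshold factor below are hypothetical values chosen for the synthetic example, and the neural-network classification stage described in the abstract is omitted:

```python
import numpy as np

def ricker(points, a):
    """Ricker ("Mexican hat") wavelet, a common CWT kernel for edge-like events."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2 / (np.sqrt(3 * a) * np.pi ** 0.25)
    return amp * (1 - (t / a) ** 2) * np.exp(-t ** 2 / (2 * a ** 2))

def detect_saccades(eog, scale=16, k=3.0):
    """Return sample indices where the single-scale wavelet response is large.

    `scale` and the threshold factor `k` are illustrative, not the paper's
    parameters.
    """
    w = ricker(10 * scale, scale)
    # 'valid' mode avoids the spurious edge spikes that zero-padding would add
    coeffs = np.convolve(eog, w, mode="valid")
    score = np.abs(coeffs)
    hits = np.flatnonzero(score > k * score.std())
    return hits + len(w) // 2  # map back to input-sample indices

# Synthetic horizontal EOG: a saccade appears as a step in the
# corneo-retinal potential at sample 500, plus measurement noise.
rng = np.random.default_rng(0)
eog = np.where(np.arange(1000) >= 500, 1.0, 0.0) + 0.02 * rng.standard_normal(1000)

events = detect_saccades(eog)
print(events.min(), events.max())  # detections cluster around sample 500
```

Because the Ricker wavelet integrates to zero, the slowly drifting baseline typical of EOG recordings is rejected and only abrupt potential shifts produce large coefficients, which is what makes wavelet analysis attractive for real-time saccade detection on a small embedded platform.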
Keywords: electrooculography; eye movement; human–computer interface; wavelet transform; neural network
MeSH:
Year: 2010 PMID: 22346579 PMCID: PMC3274094 DOI: 10.3390/s110100310
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. System architecture.
Figure 2. EOG goggles based on Vuzix Wrap 230 eyewear.
Figure 3. Electrical system diagram.
Figure 4. Image of the AM.
Figure 5. Saccadic eye-movement detection process.
Figure 6. Effect of blinking on the vertical EOG.
Figure 7. Timeline.
Figure 8. Eye-movement detection process sequence.
Figure 9. Eight-command interface.
Experiment results.
| User 1–man | 89 | 15 |
| User 2–woman | 98 | 6 |
| User 3–man | 94 | 10 |
| User 4–woman | 100 | 4 |
| User 5–man | 97 | 7 |
Figure 10. HCI errors in relation to time.