| Literature DB >> 34069310 |
Edgar Bañuelos-Lozoya, Gabriel González-Serna, Nimrod González-Franco, Olivia Fragoso-Diaz, Noé Castro-Sánchez.
Abstract
Traditional evaluation of user experience is subjective by nature, which is why data from physiological and behavioral sensors are being used to interpret the relationship between a user's cognitive states and the elements of a graphical interface and its interaction mechanisms. This study presents a systematic review conducted to determine the cognitive states being investigated in the context of Quality of Experience (QoE)/User Experience (UX) evaluation, as well as the signals and features obtained, the machine learning models used, the evaluation architectures proposed, and the results achieved. Twenty-nine papers published in 2014–2019 were selected from eight online information sources; of these, 24% related to the classification of cognitive states, 17% described evaluation architectures, and 41% presented correlations between different signals, cognitive states, and QoE/UX metrics, among others. The number of identified studies was low compared with cognitive state research in other contexts, such as driving or other critical activities; however, this provides a starting point for analyzing and interpreting states such as mental workload, confusion, and mental stress from various human signals, and for proposing more robust QoE/UX evaluation architectures.
Keywords: QoE; UX; behavioral data; biometric sensors; cognitive states; physiological data
Year: 2021 PMID: 34069310 PMCID: PMC8156405 DOI: 10.3390/s21103439
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Groups of search keywords.
| Groups | Keywords |
|---|---|
| Cognitive states | cognitive states, cognitive state |
| Data | physiological, EEG, GSR, ECG, eye tracking, sensor, multimodal |
| Machine learning | machine learning, deep learning |
| User experience | user experience, UX, QoE |
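The record does not state how the keyword groups above were combined into a search string. A plausible sketch, assuming the usual systematic-review convention of OR within a group and AND across groups (an assumption, not confirmed by this record):

```python
# Hedged sketch: combining the keyword groups from the table above into a
# boolean search string. OR within a group, AND across groups is assumed;
# the authors' actual query syntax is not given in this record.

groups = {
    "Cognitive states": ["cognitive states", "cognitive state"],
    "Data": ["physiological", "EEG", "GSR", "ECG", "eye tracking",
             "sensor", "multimodal"],
    "Machine learning": ["machine learning", "deep learning"],
    "User experience": ["user experience", "UX", "QoE"],
}

def build_query(groups: dict) -> str:
    """Join each group's keywords with OR, then join groups with AND."""
    clauses = ["(" + " OR ".join(f'"{kw}"' for kw in kws) + ")"
               for kws in groups.values()]
    return " AND ".join(clauses)

print(build_query(groups))
```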
Figure 1. Flow diagram showing the paper selection process.
Figure 2. Selected papers by year.
Summary of papers with the classification of cognitive states.
| Ref. | Year | Cognitive States | Best Performing Models | No. of Subjects (Female/ Male) | Stimulus | Data |
|---|---|---|---|---|---|---|
| [ | 2016 | Confusion | RF, sensitivity 0.61, specificity 0.926 | 136 (75F/61M) | Data visualization software | Self-report, ET (with pupillometry), clicks |
| [ | 2016 | Mental workload, attention | LDA, accuracy: 92% mental workload and 86% attention | 12 (3F/9M) | Virtual maze game | Self-report, EEG, keyboard, and touch behavior |
| [ | 2016 | Mental stress | RF, click-level user-dependent f1-score 0.66; logistic classifier, session-level user-independent f1-score 0.79 | 20 (7F/13M) | Arithmetic questions software | ET (from video), clicks |
| [ | 2016 | Engagement | SVM, f1-score 0.82 | 10 (3F/7M), 10 (3F/7M), 130 (34F/96M) | Cell phone usage | 1st and 2nd studies: EEG and usage logs; 3rd study: usage logs, context, and demographic data |
| [ | 2018 | Mental workload | MLP, accuracy 93.7% | 61 (19F/42M) | Website browsing | EDA, Photoplethysmography (PPG), temperature, ECG, EEG, ET (with pupillometry) |
| [ | 2019 | Confusion | RF, accuracy range 72.6–99.1% | 29 (14F/15M) | Personal data sheets | ET, age, gender |
| [ | 2019 | Engagement (as a basis for interest detection) | kNN (k-Nearest Neighbors), average accuracy 80.3% | 4 (2F/2M) | Videos | Self-report, EEG |
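The table reports several evaluation metrics (sensitivity, specificity, f1-score, accuracy). As a reference for how they relate, a minimal stdlib sketch computing all of them from binary confusion-matrix counts; the counts below are chosen purely for illustration so that sensitivity and specificity come out near the confusion-detection row's 0.61 and 0.926, and are not taken from any surveyed study:

```python
# Hedged sketch: the classification metrics reported in the table above,
# derived from binary confusion-matrix counts. Counts are illustrative only.

def binary_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Compute common binary classification metrics from confusion counts."""
    sensitivity = tp / (tp + fn)   # recall: fraction of positives detected
    specificity = tn / (tn + fp)   # fraction of negatives correctly rejected
    precision = tp / (tp + fp)
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return {"sensitivity": sensitivity, "specificity": specificity,
            "precision": precision, "f1": f1, "accuracy": accuracy}

# Illustrative counts: 61/100 positives detected, 926/1000 negatives rejected
m = binary_metrics(tp=61, fp=74, tn=926, fn=39)
print({k: round(v, 3) for k, v in m.items()})
```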
Summary of research on correlations between cognitive states and QoE/UX metrics.
| Ref. | Year | Objective | No. of Subjects (Female/ Male) | Stimulus | Data |
|---|---|---|---|---|---|
| [ | 2014 | Correlations between frontal alpha EEG asymmetry, experience and task difficulty | 20 (10F/10M) | Mobile application tasks | Self-report; EEG |
| [ | 2014 | Correlations between GSR and task performance metrics | 20 (10F/10M) | Mobile application tasks | Self-report; GSR, blood volume pulse, heart rate, EEG, and respiration |
| [ | 2014 | Correlations between quality perception, brain activity, and ET metrics | 19 (11F/8M) | Videos | EEG and ET (with pupillometry) |
| [ | 2015 | QoE evaluation | 32 (5F/27M) | Online game | Self-report; EEG |
| [ | 2015 | EEG power analysis during tasks with cognitive differences | 30 (20F/10M) | Two-Picture cognitive task and video game | EEG, screen, and frontal videos |
| [ | 2015 | Flow state analysis based on engagement and arousal indices | 30 (20F/10M) | Video game | EEG, screen and frontal videos |
| [ | 2016 | Sleepiness analysis | 12 (3F/9M), 24 (8F/16M) | Videos | 1st study: self-report, EEG, electrooculogram (EOG); 2nd study: self-report, EEG, GSR, ECG, and electromyogram (EMG) |
| [ | 2017 | Cognitive load, product sorting, and users’ goal analysis | 21 (10F/11M) | Online shopping tasks | EEG |
| [ | 2017 | Correlations between ET, acceptance and perception | 10 (7F/3M) | Database creation assistant | Self-report; ET (with pupillometry), clicks, and screen video |
| [ | 2018 | Visual attention and task performance analysis | 38 (not indicated) | Online shopping tasks | ET |
| [ | 2019 | Analysis of the attitude towards a website considering visual attention, cognitive load, product type, and arithmetic complexity | 38 (17F/21M) | Online shopping tasks | Self-report; ET (with pupillometry) |
| [ | 2019 | Usability evaluation | 30 (15F/15M) | Website tasks | Self-report; screen and frontal videos, mouse and keyboard usage logs, EEG |
Figure 3. Number of subjects in experiments by signal.
Figure 4. Subject sex ratio in experiments by signal.