Marta Sylvia Del Rio Guerra, Jorge Martin-Gutierrez.
Abstract
The ever-growing and widespread use of touch, face, full-body, and 3D mid-air gesture recognition sensors in domestic and industrial settings raises the question of whether interactive gestures are sufficiently inclusive and can be executed by all users. The purpose of this study was to analyze full-body gestures from the point of view of user experience using the Microsoft Kinect sensor, in order to identify which gestures are easy to perform for individuals living with Down syndrome. With this information, app developers can satisfy Design for All (DfA) requirements by selecting suitable gestures from existing lists of gesture sets. A set of twenty full-body gestures was analyzed in this study; to do so, the research team developed an application to measure the success/failure rates and execution times of each gesture. The results show that the failure rate for gesture execution is greater than the success rate, and that there is no difference between male and female participants in terms of execution times or the successful execution of gestures. Through this study, we conclude that, in general, people living with Down syndrome are not able to perform certain full-body gestures correctly. This is a direct consequence of limitations resulting from characteristic physical and motor impairments, and as a result the Microsoft Kinect sensor cannot identify the gestures. It is important to bear this in mind when developing gesture-based Human-Computer Interaction (HCI) applications that use the Kinect sensor as an input device and that are intended for users with such disabilities.
Keywords: Down syndrome; Microsoft Kinect; corporal gestures; evaluation; full-body gestures; human–computer interaction; sensor; user experience; user interface
Year: 2020 PMID: 32679704 PMCID: PMC7411764 DOI: 10.3390/s20143930
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Physical gestures selected to study the User Interface (UI).
| Gesture Name | Gesture Name |
|---|---|
| (#1) Hold both arms next to side of body | (#11) Raise left arm high over head |
| (#2) Raise right arm out to the side at a 90° angle | (#12) Raise right arm and bend elbow |
| (#3) Raise left arm out to the side at a 90° angle | (#13) Raise left arm and bend elbow |
| (#4) A shape. Raise arms slightly to form an ‘A’. | (#14) Raise both arms and bend elbows |
| (#5) Crucifixion. Raise both arms out to the side at a 90° angle | (#15) Raise both hands to head |
| (#6) Raise right arm in front of body | (#16) Place hands on hips |
| (#7) Raise left arm in front of body | (#17) Cross arms in front of body |
| (#8) Raise arms in front of body | (#18) Raise right arm to head |
| (#9) Raise both arms high over head | (#19) Raise left arm to head |
| (#10) Raise right arm high over head | (#20) Bend forward |
Recognition rate according to the gesture type.
| Gesture Name | Accuracy (%) | Error (%) | Gesture Name | Accuracy (%) | Error (%) |
|---|---|---|---|---|---|
| Gesture (#1) | 100.00 | 0.00 | Gesture (#11) | 97.78 | 0.34 |
| Gesture (#2) | 98.10 | 1.10 | Gesture (#12) | 95.33 | 3.33 |
| Gesture (#3) | 97.69 | 1.56 | Gesture (#13) | 95.50 | 2.83 |
| Gesture (#4) | 96.58 | 2.23 | Gesture (#14) | 96.88 | 1.34 |
| Gesture (#5) | 100.00 | 0.00 | Gesture (#15) | 94.45 | 2.32 |
| Gesture (#6) | 94.89 | 3.45 | Gesture (#16) | 93.99 | 3.43 |
| Gesture (#7) | 94.50 | 2.87 | Gesture (#17) | 93.93 | 5.50 |
| Gesture (#8) | 93.12 | 3.34 | Gesture (#18) | 95.23 | 1.92 |
| Gesture (#9) | 98.24 | 0.24 | Gesture (#19) | 94.84 | 2.23 |
| Gesture (#10) | 96.45 | 0.31 | Gesture (#20) | 92.23 | 3.44 |
Figure 1. Overview of the build process used to create a dataset [43].
Figure 2. (a) Designing a Kinect routine; (b) participant performing gesture #15, ‘Raise both hands to head’.
Figure 3. (a) User registration; (b) list of registered users; (c) list of tasks performed.
Figure 4. Entity Relationship Diagram (ERD) of the database.
Figure 5. Set-up for gesture testing sessions.
Figure 6. (a) Participant attempting to perform a gesture (1); (b) moderator demonstrating a gesture to a participant.
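Figures 3 and 4 describe the test application's user registration, task lists, and underlying database (ERD). The sketch below is a minimal, hypothetical layout of how such a results database could be structured using Python's sqlite3; the table and column names are illustrative assumptions, not the authors' actual schema.

```python
import sqlite3

# Hypothetical schema sketch for the kind of database suggested by Figures 3-4:
# registered users, the 20 predefined gestures, and one row per gesture attempt
# recording success/failure and execution time. All names are illustrative only.
conn = sqlite3.connect("kinect_gestures.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS user (
    user_id     INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    gender      TEXT,                 -- e.g. 'M' / 'F'
    birth_date  TEXT
);
CREATE TABLE IF NOT EXISTS gesture (
    gesture_id  INTEGER PRIMARY KEY,  -- 1..20, matching the gesture set above
    description TEXT NOT NULL
);
CREATE TABLE IF NOT EXISTS attempt (
    attempt_id  INTEGER PRIMARY KEY,
    user_id     INTEGER NOT NULL REFERENCES user(user_id),
    gesture_id  INTEGER NOT NULL REFERENCES gesture(gesture_id),
    success     INTEGER NOT NULL,     -- 1 = recognized by the Kinect, 0 = failed
    exec_time_s REAL                  -- execution time in seconds
);
""")
conn.commit()
```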
Contingency tables for success rates of Kinect gestures.
| Gesture | No. Successful Gestures | No. Failed Gestures | No. Total Gestures | Percentage of Successful Gestures | Percentage of Failed Gestures |
|---|---|---|---|---|---|
| (#1) Hold both arms next to side of body | 22 | 21 | 43 | 51.2% * | 48.8% |
| (#2) Raise right arm out to the side at a 90° angle | 17 | 20 | 37 | 45.9% | 54.1% |
| (#3) Raise left arm out to the side at a 90° angle | 15 | 22 | 37 | 40.5% | 59.5% |
| (#4) A shape. Raise arms slightly to form an ‘A’. | 6 | 36 | 42 | 14.3% | 85.7% |
| (#5) Crucifixion. Raise both arms out to the side at a 90° angle | 22 | 19 | 41 | 53.7% * | 46.3% |
| (#6) Raise right arm in front of body | 13 | 24 | 37 | 35.1% | 64.9% |
| (#7) Raise left arm in front of body | 13 | 24 | 37 | 35.1% | 64.9% |
| (#8) Raise arms in front of body | 20 | 18 | 38 | 52.6% * | 47.4% |
| (#9) Raise both arms high over head | 20 | 20 | 40 | 50.0% * | 50.0% |
| (#10) Raise right arm high over head | 20 | 16 | 36 | 55.6% * | 44.4% |
| (#11) Raise left arm high over head | 20 | 16 | 36 | 55.6% * | 44.4% |
| (#12) Raise right arm and bend elbow | 19 | 18 | 37 | 51.4% * | 48.6% |
| (#13) Raise left arm and bend elbow | 19 | 18 | 37 | 51.4% * | 48.6% |
| (#14) Raise both arms and bend elbows | 26 | 9 | 35 | 74.3% * | 25.7% |
| (#15) Raise both hands to head | 8 | 29 | 37 | 21.6% | 78.4% |
| (#16) Place hands on hips | 10 | 27 | 37 | 27.0% | 73.0% |
| (#17) Cross arms in front of body | 17 | 21 | 38 | 44.7% | 55.3% |
| (#18) Raise right arm to head | 16 | 21 | 37 | 43.2% | 56.8% |
| (#19) Raise left arm to head | 16 | 21 | 37 | 43.2% | 56.8% |
| (#20) Bend forward | 9 | 25 | 34 | 26.5% | 73.5% |
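The percentages in the table above are simple row-wise proportions of the raw counts. The short sketch below recomputes them and flags gestures whose success rate is at least 50%; the rows marked with an asterisk all satisfy this, although that reading of the asterisk is an assumption here, as the record includes no footnote for it.

```python
# Recompute the per-gesture percentages from the raw counts in the contingency
# table above. Only the first two gestures are listed; the rest follow the same pattern.
counts = {
    "(#1) Hold both arms next to side of body": (22, 21),
    "(#2) Raise right arm out to the side at a 90° angle": (17, 20),
    # ... gestures #3-#20 omitted for brevity
}

for name, (ok, fail) in counts.items():
    total = ok + fail
    pct_ok = 100 * ok / total
    flag = " *" if pct_ok >= 50 else ""   # assumption: '*' marks success rates >= 50%
    print(f"{name}: {pct_ok:.1f}% success, {100 - pct_ok:.1f}% failure{flag}")
```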
Figure 7. Success rate percentages by Kinect gesture.
Figure 8. Success rate of Kinect gestures by gender.
Pearson’s Chi-square (χ²) test for Kinect gestures.
| | Value | df | Asymp. Sig. |
|---|---|---|---|
| Pearson’s Chi-Square | 56.560 * | 19 | 0.000 |
| Likelihood Ratio | 59.755 | 19 | 0.000 |
| Linear-by-Linear Association | 0.206 | 1 | 0.650 |
| No. Valid Cases | 753 | | |
* 0 cells (0.0%) have an expected count less than 5. The minimum expected count is 14.81.
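The chi-square reported above (χ² = 56.560, df = 19) tests whether the success rate depends on the gesture. A minimal sketch, assuming SciPy is available, of how such a test could be recomputed from the success/failure counts in the contingency table:

```python
from scipy.stats import chi2_contingency

# Success/failure counts per gesture, taken from the contingency table above
# (gestures #1..#20 in order; column sums are 328 successes and 425 failures).
successes = [22, 17, 15, 6, 22, 13, 13, 20, 20, 20, 20, 19, 19, 26, 8, 10, 17, 16, 16, 9]
failures  = [21, 20, 22, 36, 19, 24, 24, 18, 20, 16, 16, 18, 18, 9, 29, 27, 21, 21, 21, 25]

# 20 x 2 table: one row per gesture, columns are (successful, failed) attempts.
table = list(zip(successes, failures))
chi2, p, dof, expected = chi2_contingency(table)

print(f"chi2 = {chi2:.3f}, df = {dof}, p = {p:.3f}")
print(f"minimum expected count = {expected.min():.2f}")
# The table above reports Pearson's chi-square = 56.560, df = 19, p < 0.001,
# with a minimum expected count of 14.81.
```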
Contingency table for success rates by gender.
| Gender | No. Successful Gestures | No. Failed Gestures | No. Total Gestures | Percentage of Successful Gestures | Percentage of Failed Gestures |
|---|---|---|---|---|---|
| Men | 263 | 285 | 548 | 48.0% | 52.0% |
| Women | 65 | 140 | 205 | 31.7% | 68.3% |
| Total | 328 | 425 | 753 | 43.6% | 56.4% |
Pearson’s Chi-square (χ²) test for differences by gender.
| | Value | df | Asymp. Sig. | Exact Sig. (2-Sided) | Exact Sig. (1-Sided) |
|---|---|---|---|---|---|
| Pearson’s Chi-Square | 16.094 ** | 1 | 0.000 | | |
| Continuity Correction * | 15.438 | 1 | 0.000 | | |
| Likelihood Ratio | 16.440 | 1 | 0.000 | | |
| Fisher’s Exact Test | | | | 0.000 | 0.000 |
| No. Valid Cases | 753 | | | | |
* Computed only for a 2 × 2 table. ** 0 cells (0.0%) have an expected count less than 5. The minimum expected count is 89.30.
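The gender comparison is a 2 × 2 table (men: 263 successful / 285 failed; women: 65 / 140). A minimal sketch, again assuming SciPy, of how the reported statistics (Pearson's χ² ≈ 16.09, continuity-corrected ≈ 15.44, Fisher's exact test) could be checked:

```python
from scipy.stats import chi2_contingency, fisher_exact

# 2 x 2 contingency table from the gender analysis above:
# rows = men, women; columns = successful, failed gestures.
table = [[263, 285],
         [65, 140]]

chi2, p, dof, _ = chi2_contingency(table, correction=False)
chi2_yates, p_yates, _, _ = chi2_contingency(table, correction=True)
odds_ratio, p_fisher = fisher_exact(table)

print(f"Pearson chi2 = {chi2:.3f} (p = {p:.4f})")        # ~16.09, as reported
print(f"With continuity correction = {chi2_yates:.3f}")  # ~15.44, as reported
print(f"Fisher's exact test p = {p_fisher:.4f}")
```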
Figure 9. Gestures by execution time and gender.
Contrast statistics (Kruskal–Wallis test, grouping variable: ID_gesture).
| | Execution Times for Successful Gestures |
|---|---|
| Chi-Square | 26.488 |
| df | 19 |
| Asymp. Sig. | 0.117 |
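The Kruskal–Wallis contrast compares execution-time distributions of successful attempts across the 20 gestures (grouping variable ID_gesture), and its non-significant result (p = 0.117) indicates no difference in execution times between gestures. The raw per-attempt times are not reproduced in this record, so the sketch below only illustrates the call, assuming SciPy and a hypothetical mapping from gesture ID to a list of execution times:

```python
from scipy.stats import kruskal

# times_by_gesture is a hypothetical dict {gesture_id: [execution times in seconds
# for successful attempts]}; the per-attempt data are not included in this record,
# so the values shown are placeholders only.
times_by_gesture = {
    1: [2.1, 1.8, 2.4],
    2: [2.9, 3.1, 2.7],
    # ... groups for gestures #3-#20
}

# Kruskal-Wallis H test across the gesture groups; the record above reports
# chi-square = 26.488, df = 19, asymptotic significance = 0.117.
statistic, p_value = kruskal(*times_by_gesture.values())
print(f"H = {statistic:.3f}, p = {p_value:.3f}")
```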