Hong-Bo Zhang, Yi-Xiang Zhang, Bineng Zhong, Qing Lei, Lijie Yang, Ji-Xiang Du, Duan-Sheng Chen.
Abstract
Although widely used in many applications, accurate and efficient human action recognition remains a challenging area of research in the field of computer vision. Most recent surveys have focused on narrow problems such as human action recognition methods using depth data, 3D-skeleton data, still image data, spatiotemporal interest point-based methods, and human walking motion recognition. However, there has been no systematic survey of human action recognition. To this end, we present a thorough review of human action recognition methods and provide a comprehensive overview of recent approaches in human action recognition research, including progress in hand-designed action features in RGB and depth data, current deep learning-based action feature representation methods, advances in human–object interaction recognition methods, and the currently prominent research topic of action detection methods. Finally, we present several analysis recommendations for researchers. This survey provides an essential reference for those interested in further research on human action recognition.
Keywords: action detection; action feature; human action recognition; human–object interaction recognition; systematic survey
Year: 2019 PMID: 30818796 PMCID: PMC6427144 DOI: 10.3390/s19051005
Source DB: PubMed Journal: Sensors (Basel) ISSN: 1424-8220 Impact factor: 3.576
Figure 1. Classification framework for human action recognition methods.
Popular datasets for human action recognition.
| Dataset Name | Color | Depth | Skeleton | Samples | Classes |
|---|---|---|---|---|---|
| Hollywood2 [ | √ | × | × | 1707 | 12 |
| HMDB51 [ | √ | × | × | 6766 | 51 |
| Olympic Sports [ | √ | × | × | 783 | 16 |
| UCF50 [ | √ | × | × | 6618 | 50 |
| UCF101 [ | √ | × | × | 13,320 | 101 |
| Kinetics [ | √ | × | × | 306,245 | 400 |
| MSR-Action3D [ | × | √ | √ | 567 | 20 |
| MSR-Daily Activity [ | √ | √ | √ | 320 | 16 |
| Northwestern-UCLA [ | √ | √ | √ | 1475 | 10 |
| UTD-MHAD [ | √ | √ | √ | 861 | 27 |
| RGBD-HuDaAct [ | √ | √ | × | 1189 | 13 |
| NTU RGB+D [ | √ | √ | √ | 56,880 | 60 |
Recognition accuracies of methods on RGB datasets. The superscript D indicates a deep learning-based method.
| Methods | Year | Hollywood2 | HMDB51 | Olympic Sports | UCF50 | UCF101 | Kinetics |
|---|---|---|---|---|---|---|---|
| [ | 2018 | 86.4% | | | | | |
| [ | 2018 | 81.5% | | | | | |
| [ | 2018 | 68.1% | 94% | | | | |
| [ | 2017 | 78.7% | 97.1% | | | | |
| [ | 2017 | 63.5% | 93.2% | | | | |
| [ | 2017 | 75% | 95.3% | | | | |
| [ | 2017 | 79% | | | | | |
| [ | 2017 | 74.8% | 95.8% | | | | |
| [ | 2017 | 80.2% | 97.9% | | | | |
| [ | 2016 | 55.2% | 85.4% | | | | |
| [ | 2016 | 69.4% | 94.2% | | | | |
| [ | 2016 | 66.8% | 60.1% | 90.4% | 91.7% | 86% | |
| [ | 2016 | 94.8% | 78.86% | | | | |
| [ | 2016 | 65.4% | 92.5% | | | | |
| [ | 2015 | 63.2% | 91.5% | | | | |
| [ | 2016 | 61.7% | 88.3% | | | | |
| [ | 2015 | 68% | 65.1% | 91.4% | 94.4% | 89.1% | |
| [ | 2015 | 49.9% | 85.2% | 79.5% | | | |
| [ | 2015 | 70% | 61.8% | | | | |
| [ | 2014 | 37.3% | 86.04% | 70.1% | | | |
| [ | 2014 | 57.2% | 85.9% | | | | |
| [ | 2014 | 59.4% | 88% | 81.3% | | | |
| [ | 2014 | 54.4% | 41.3% | 85.5% | | | |
| [ | 2013 | 27.02% | 68.20% | | | | |
| [ | 2013 | 49.3% | | | | | |
Recognition accuracies of methods on RGB-D (red, green, blue, plus depth) and skeleton datasets. The superscript D indicates a deep learning-based method.
| Methods | Year | MSR-Action3D | MSR-Daily Activity | Northwestern-UCLA | UTD-MHAD | NTU RGB+D |
|---|---|---|---|---|---|---|
| [ | 2018 | 96.2% | | | | |
| [ | 2018 | 73.4% | | | | |
| [ | 2018 | 30.7% | | | | |
| [ | 2017 | 93.3% | 94.1% | | | |
| [ | 2016 | 62.93% | | | | |
| [ | 2016 | 100% | 69.2% | | | |
| [ | 2015 | 100% | 81.88% | | | |
| [ | 2015 | 91.2% | | | | |
| [ | 2015 | 94.9% | 83.8% | | | |
| [ | 2015 | 79.1% | | | | |
| [ | 2014 | 93.09% | 86.25% | 31.82% | | |
| [ | 2014 | 73.1% | 81.6% | | | |
| [ | 2014 | 82.30% | | | | |
| [ | 2014 | 92.46% | 50.1% | | | |
| [ | 2014 | 88.82% | 81.25% | | | |
| [ | 2014 | 94.4% | 93.1% | | | |
| [ | 2013 | 88.89% | 30.56% | | | |
| [ | 2013 | 85.6% | | | | |
| [ | 2013 | 91.26% | | | | |
| [ | 2013 | 91.80% | | | | |