
Desktop Action Recognition From First-Person Point-of-View.

Minjie Cai, Feng Lu, Yue Gao.   

Abstract

Desktop action recognition from first-person (egocentric) video is an important task, both because desktop activities are ubiquitous in daily life and because the first-person viewpoint is ideal for observing hand-object interactions. However, no previous research effort has been dedicated to benchmarking this task. In this paper, we release a dataset of daily desktop actions recorded with a wearable camera and publish it as a benchmark for desktop action recognition. Regular desktop activities of six participants were recorded as egocentric video with a wide-angle head-mounted camera. In particular, we focus on five common desktop actions that involve the hands. We provide the original video data, frame-level action annotations, and pixel-level hand masks. We also propose a feature representation that characterizes different desktop actions based on the spatial and temporal information of the hands. In experiments, we present statistical information about the dataset and evaluate the action recognition performance of different features as a baseline. The proposed method achieves promising performance on the five action classes.


Year: 2018    PMID: 29994596    DOI: 10.1109/TCYB.2018.2806381

Source DB: PubMed    Journal: IEEE Trans Cybern    ISSN: 2168-2267    Impact factor: 11.448


Related articles: 1 in total

1.  A union of deep learning and swarm-based optimization for 3D human action recognition.

Authors:  Hritam Basak; Rohit Kundu; Pawan Kumar Singh; Muhammad Fazal Ijaz; Marcin Woźniak; Ram Sarkar
Journal: Sci Rep    Date: 2022-03-31    Impact factor: 4.996

