
Delving into Egocentric Actions.

Yin Li; Zhefan Ye; James M Rehg

Abstract

We address the challenging problem of recognizing the camera wearer's actions from videos captured by an egocentric camera. Egocentric videos encode a rich set of signals about the camera wearer, including head movement, hand pose, and gaze information. We propose to use these mid-level egocentric cues for egocentric action recognition. We present a novel set of egocentric features and show how they can be combined with motion and object features. The result is a compact representation with superior performance. In addition, we provide the first systematic evaluation of motion, object, and egocentric cues in egocentric action recognition. Our benchmark leads to several surprising findings. These findings uncover the best practices for egocentric action recognition, with a significant performance boost over all previous state-of-the-art methods on three publicly available datasets.
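The abstract describes combining egocentric cues (head movement, hand pose, gaze) with motion and object features into a single compact representation for action classification. The paper's actual features and classifier are not reproduced in this record; the following is only a minimal sketch of the general idea of late fusion by feature concatenation, with synthetic toy descriptors and a simple nearest-centroid classifier standing in for the real model:

```python
import numpy as np

rng = np.random.default_rng(0)

def combine_features(motion, obj, ego):
    """Late fusion by concatenation: stack the three descriptor
    vectors into one combined per-video representation."""
    return np.concatenate([motion, obj, ego])

def make_video(label):
    # Hypothetical toy descriptors for a video of action class `label`;
    # the dimensions and statistics here are illustrative only.
    motion = rng.normal(label, 0.1, size=8)   # stand-in for motion features
    obj = rng.normal(label, 0.1, size=8)      # stand-in for object features
    ego = rng.normal(label, 0.1, size=4)      # stand-in for egocentric cues
    return combine_features(motion, obj, ego)

# Toy training set: two action classes, 5 videos each.
X = np.stack([make_video(c) for c in (0, 1) for _ in range(5)])
y = np.array([c for c in (0, 1) for _ in range(5)])

# Nearest-centroid classification over the fused representation.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))
```

The sketch only illustrates that heterogeneous per-video descriptors can be fused by concatenation before classification; any competitive system would replace the toy descriptors and the nearest-centroid rule with the learned features and models evaluated in the paper.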


Year:  2015        PMID: 26973427      PMCID: PMC4784702          DOI: 10.1109/CVPR.2015.7298625

Source DB:  PubMed          Journal:  Proc IEEE Comput Soc Conf Comput Vis Pattern Recognit        ISSN: 1063-6919


References (3 in total)

1.  In what ways do eye movements contribute to everyday activities?

Authors:  M F Land; M Hayhoe
Journal:  Vision Res       Date:  2001       Impact factor: 1.886

2.  Actions in the Eye: Dynamic Gaze Datasets and Learnt Saliency Models for Visual Recognition.

Authors:  Stefan Mathe; Cristian Sminchisescu
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2015-07       Impact factor: 6.226

3.  Facing Imbalanced Data Recommendations for the Use of Performance Metrics.

Authors:  László A Jeni; Jeffrey F Cohn; Fernando De La Torre
Journal:  Int Conf Affect Comput Intell Interact Workshops       Date:  2013
Cited by (5 in total)

1.  Hand-Priming in Object Localization for Assistive Egocentric Vision.

Authors:  Kyungjun Lee; Abhinav Shrivastava; Hernisa Kacorri
Journal:  IEEE Winter Conf Appl Comput Vis       Date:  2020-05-14

2.  Hands Holding Clues for Object Recognition in Teachable Machines.

Authors:  Kyungjun Lee; Hernisa Kacorri
Journal:  Proc SIGCHI Conf Hum Factor Comput Syst       Date:  2019-05

3.  A Hierarchical Deep Fusion Framework for Egocentric Activity Recognition Using a Wearable Hybrid Sensor System.

Authors:  Haibin Yu; Guoxiong Pan; Mian Pan; Chong Li; Wenyan Jia; Li Zhang; Mingui Sun
Journal:  Sensors (Basel)       Date:  2019-01-28       Impact factor: 3.576

4.  Toward Shared Autonomy Control Schemes for Human-Robot Systems: Action Primitive Recognition Using Eye Gaze Features.

Authors:  Xiaoyu Wang; Alireza Haji Fathaliyan; Veronica J Santos
Journal:  Front Neurorobot       Date:  2020-10-15       Impact factor: 2.650

5.  STAC: Spatial-Temporal Attention on Compensation Information for Activity Recognition in FPV.

Authors:  Yue Zhang; Shengli Sun; Linjian Lei; Huikai Liu; Hui Xie
Journal:  Sensors (Basel)       Date:  2021-02-05       Impact factor: 3.576

