A Driver's Visual Attention Prediction Using Optical Flow.

Byeongkeun Kang, Yeejin Lee.

Abstract

Motion in videos refers to the pattern of apparent movement of objects, surfaces, and edges across image sequences, caused by the relative movement between a camera and a scene. Motion, like scene appearance, is an essential feature for estimating a driver's visual attention allocation in computer vision. However, although appearance-based driver attention prediction models have been well studied, the role of motion as a crucial factor in attention estimation has not been thoroughly examined in the literature. Therefore, in this work, we investigate the usefulness of motion information for estimating a driver's visual attention. To analyze its effectiveness, we develop a deep neural network framework that predicts attention locations and attention levels from optical flow maps, which represent the movement of content in videos. We validate the proposed motion-based prediction model by comparing it against current state-of-the-art prediction models that use RGB frames. Experimental results on a real-world dataset confirm our hypothesis that motion contributes to prediction accuracy, and that there is a margin for further accuracy improvement by using motion features.
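The abstract describes optical flow, the per-pixel apparent motion between consecutive frames, as the model's input. As a minimal illustration of the underlying matching principle only (the paper itself uses precomputed optical flow maps fed to a deep network; nothing below reflects its actual implementation), the sketch estimates a single translational motion between two toy frames by brute-force block matching. Dense flow methods apply the same correspondence idea per pixel or local window, or learn it end-to-end.

```python
import numpy as np

def estimate_translation(prev, curr, max_shift=3):
    """Find the integer shift (dy, dx) that best aligns `prev` to `curr`
    by minimizing the mean squared difference over the overlapping region."""
    h, w = prev.shape
    best, best_err = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # crop both frames to the region where they overlap after shifting
            a = prev[max(0, -dy):h - max(0, dy), max(0, -dx):w - max(0, dx)]
            b = curr[max(0, dy):h - max(0, -dy), max(0, dx):w - max(0, -dx)]
            err = float(np.mean((a - b) ** 2))
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best

# Toy frames: a bright square moves 1 px down and 2 px right.
prev = np.zeros((16, 16)); prev[4:8, 4:8] = 1.0
curr = np.zeros((16, 16)); curr[5:9, 6:10] = 1.0
flow = estimate_translation(prev, curr)  # → (1, 2)
```

This recovers one global motion vector; a per-pixel optical flow map, as used in the paper, is conceptually the result of resolving such a correspondence at every image location.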

Keywords:  convolutional neural networks; driver’s perception modeling; intelligent vehicle system; optical flow; visual attention estimation

Year:  2021        PMID: 34071901     DOI: 10.3390/s21113722

Source DB:  PubMed          Journal:  Sensors (Basel)        ISSN: 1424-8220            Impact factor:   3.576


References: 13 in total

1.  Segregation of object and background motion in visual area MT: effects of microstimulation on eye movements.

Authors:  R T Born; J M Groh; R Zhao; S J Lukasewycz
Journal:  Neuron       Date:  2000-06       Impact factor: 17.173

2.  Steering with or without the flow: is the retrieval of heading necessary?

Authors: 
Journal:  Trends Cogn Sci       Date:  2000-08       Impact factor: 20.229

3.  State-of-the-art in visual attention modeling.

Authors:  Ali Borji; Laurent Itti
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2013-01       Impact factor: 6.226

4.  Contextual guidance of eye movements and attention in real-world scenes: the role of global features in object search.

Authors:  Antonio Torralba; Aude Oliva; Monica S Castelhano; John M Henderson
Journal:  Psychol Rev       Date:  2006-10       Impact factor: 8.934

5.  Combining head pose and eye location information for gaze estimation.

Authors:  Roberto Valenti; Nicu Sebe; Theo Gevers
Journal:  IEEE Trans Image Process       Date:  2011-07-22       Impact factor: 10.856

6.  Consistent Video Saliency Using Local Gradient Flow Optimization and Global Refinement.

Authors:  Wenguan Wang; Jianbing Shen; Ling Shao
Journal:  IEEE Trans Image Process       Date:  2015-07-22       Impact factor: 10.856

7.  The role of features in preattentive vision: comparison of orientation, motion and color cues.

Authors:  H C Nothdurft
Journal:  Vision Res       Date:  1993-09       Impact factor: 1.886

8.  Predicting the Driver's Focus of Attention: The DR(eye)VE Project.

Authors:  Andrea Palazzi; Davide Abati; Simone Calderara; Francesco Solera; Rita Cucchiara
Journal:  IEEE Trans Pattern Anal Mach Intell       Date:  2018-06-08       Impact factor: 6.226

9.  Spatial Attention Fusion for Obstacle Detection Using MmWave Radar and Vision Sensor.

Authors:  Shuo Chang; Yifan Zhang; Fan Zhang; Xiaotong Zhao; Sai Huang; Zhiyong Feng; Zhiqing Wei
Journal:  Sensors (Basel)       Date:  2020-02-11       Impact factor: 3.576

10.  Social behavior for autonomous vehicles.

Authors:  Wilko Schwarting; Alyssa Pierson; Javier Alonso-Mora; Sertac Karaman; Daniela Rus
Journal:  Proc Natl Acad Sci U S A       Date:  2019-11-22       Impact factor: 11.205
