
SUN: A Bayesian framework for saliency using natural statistics.

Lingyun Zhang, Matthew H Tong, Tim K Marks, Honghao Shan, Garrison W Cottrell.

Abstract

We propose a definition of saliency by considering what the visual system is trying to optimize when directing attention. The resulting model is a Bayesian framework from which bottom-up saliency emerges naturally as the self-information of visual features, and overall saliency (incorporating top-down information with bottom-up saliency) emerges as the pointwise mutual information between the features and the target when searching for a target. An implementation of our framework demonstrates that our model's bottom-up saliency maps perform as well as or better than existing algorithms in predicting people's fixations in free viewing. Unlike existing saliency measures, which depend on the statistics of the particular image being viewed, our measure of saliency is derived from natural image statistics, obtained in advance from a collection of natural images. For this reason, we call our model SUN (Saliency Using Natural statistics). A measure of saliency based on natural image statistics, rather than based on a single test image, provides a straightforward explanation for many search asymmetries observed in humans; the statistics of a single test image lead to predictions that are not consistent with these asymmetries. In our model, saliency is computed locally, which is consistent with the neuroanatomy of the early visual system and results in an efficient algorithm with few free parameters.
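The abstract's central formula is that bottom-up saliency is the self-information of visual features, −log p(F), where p(F) is estimated in advance from a collection of natural images rather than from the test image. A minimal illustrative sketch of that idea (not the authors' implementation; the 1-D feature, histogram estimator, and training data here are stand-in assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "natural image statistics": a histogram over a single scalar
# feature (e.g., one filter response), estimated from a training set.
# In SUN this distribution is learned from natural images in advance.
train_features = rng.normal(0.0, 1.0, size=100_000)
bins = np.linspace(-5.0, 5.0, 51)
p_feature, _ = np.histogram(train_features, bins=bins, density=True)
p_feature = np.clip(p_feature, 1e-12, None)  # avoid log(0) in empty bins

def bottom_up_saliency(feature_values):
    """Self-information -log p(F) under the pre-learned distribution."""
    idx = np.digitize(feature_values, bins) - 1
    idx = np.clip(idx, 0, len(p_feature) - 1)
    return -np.log(p_feature[idx])

# Features that are rare under the natural statistics score as more
# salient than common ones, independent of the particular test image.
common, rare = bottom_up_saliency(np.array([0.0, 4.0]))
assert rare > common
```

Because the distribution is fixed ahead of time, the same feature value receives the same saliency in every image, which is the property the abstract uses to explain search asymmetries.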


Year:  2008        PMID: 19146264     DOI: 10.1167/8.7.32

Source DB:  PubMed          Journal:  J Vis        ISSN: 1534-7362            Impact factor:   2.240


Related articles (61 in total):

1.  Modelling eye movements in a categorical search task.

Authors:  Gregory J Zelinsky; Hossein Adeli; Yifan Peng; Dimitris Samaras
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2013-09-09       Impact factor: 6.237

2.  Memory and prediction in natural gaze control.

Authors:  Gabriel Diaz; Joseph Cooper; Mary Hayhoe
Journal:  Philos Trans R Soc Lond B Biol Sci       Date:  2013-09-09       Impact factor: 6.237

3.  Cube search, revisited.

Authors:  Xuetao Zhang; Jie Huang; Serap Yigit-Elliott; Ruth Rosenholtz
Journal:  J Vis       Date:  2015-03-16       Impact factor: 2.240

4.  Influence of scene structure and content on visual search strategies.

Authors:  Tatiana A Amor; Mirko Luković; Hans J Herrmann; José S Andrade
Journal:  J R Soc Interface       Date:  2017-07       Impact factor: 4.118

5.  Saccades to future ball location reveal memory-based prediction in a virtual-reality interception task.

Authors:  Gabriel Diaz; Joseph Cooper; Constantin Rothkopf; Mary Hayhoe
Journal:  J Vis       Date:  2013-01-16       Impact factor: 2.240

6.  More target features in visual working memory leads to poorer search guidance: evidence from contralateral delay activity.

Authors:  Joseph Schmidt; Annmarie MacNamara; Greg Hajcak Proudfit; Gregory J Zelinsky
Journal:  J Vis       Date:  2014-03-05       Impact factor: 2.240

7.  SUN: Top-down saliency using natural statistics.

Authors:  Christopher Kanan; Matthew H Tong; Lingyun Zhang; Garrison W Cottrell
Journal:  Vis cogn       Date:  2009-08-01

8.  Eye movement prediction and variability on natural video data sets.

Authors:  Michael Dorr; Eleonora Vig; Erhardt Barth
Journal:  Vis cogn       Date:  2012-03-26

9.  Influence of low-level stimulus features, task dependent factors, and spatial biases on overt visual attention.

Authors:  Sepp Kollmorgen; Nora Nortmann; Sylvia Schröder; Peter König
Journal:  PLoS Comput Biol       Date:  2010-05-20       Impact factor: 4.475

10.  The utility of modeling word identification from visual input within models of eye movements in reading.

Authors:  Klinton Bicknell; Roger Levy
Journal:  Vis cogn       Date:  2012-05-23
