
Multielement visual tracking: attention and perceptual organization.

S Yantis.

Abstract

Two types of theories have been advanced to account for how attention is allocated in performing goal-directed visual tasks. According to location-based theories, visual attention is allocated to spatial locations in the image; according to object-based theories, attention is allocated to perceptual objects. Evidence for the latter view comes from experiments demonstrating the importance of perceptual grouping in selective-attention tasks. This article provides further evidence concerning the importance of perceptual organization in attending to objects. In seven experiments, observers tracked multiple randomly moving visual elements under a variety of conditions. Ten elements moved continuously about the display for several seconds; one to five of them were designated as targets before movement initiation. At the end of movement, one element was highlighted, and subjects indicated whether or not it was a target. The ease with which the elements in the target set could be perceptually grouped was systematically manipulated. In Experiments 1-3, factors that influenced the initial formation of a perceptual group were manipulated; this affected performance, but only early in practice. In Experiments 4-7, factors that influenced the maintenance of a perceptual group during motion were manipulated; this affected performance throughout practice. The results suggest that observers spontaneously grouped the target elements and directed attention toward this coherent but nonrigid virtual object. This supports object-based theories of attention and demonstrates that perceptual grouping, which is usually conceived of as a purely stimulus-driven process, can also be governed by goal-directed mechanisms.

Year: 1992        PMID: 1516359        DOI: 10.1016/0010-0285(92)90010-y

Source DB: PubMed        Journal: Cogn Psychol        ISSN: 0010-0285        Impact factor: 3.468


Cited by: 88 articles in total

1.  Overt and covert object-based attention.

Authors:  Jason S McCarley; Arthur F Kramer; Matthew S Peterson
Journal:  Psychon Bull Rev       Date:  2002-12

2.  How do we track invisible objects?

Authors:  Todd S Horowitz; Randall S Birnkrant; David E Fencsik; Linda Tran; Jeremy M Wolfe
Journal:  Psychon Bull Rev       Date:  2006-06

3.  Quadrantic deficit reveals anatomical constraints on selection.

Authors:  Thomas A Carlson; George A Alvarez; Patrick Cavanagh
Journal:  Proc Natl Acad Sci U S A       Date:  2007-08-02       Impact factor: 11.205

4.  Attentional costs in multiple-object tracking.

Authors:  Michael Tombu; Adriane E Seiffert
Journal:  Cognition       Date:  2008-02-20

5.  Separating cognitive capacity from knowledge: a new hypothesis.

Authors:  Graeme S Halford; Nelson Cowan; Glenda Andrews
Journal:  Trends Cogn Sci       Date:  2007-05-01       Impact factor: 20.229

6.  Attention and non-retinotopic feature integration.

Authors:  Thomas U Otto; Haluk Öğmen; Michael H Herzog
Journal:  J Vis       Date:  2010-10-01       Impact factor: 2.240

7.  Neural measures of individual differences in selecting and tracking multiple moving objects.

Authors:  Trafton Drew; Edward K Vogel
Journal:  J Neurosci       Date:  2008-04-16       Impact factor: 6.167

8.  Hierarchical structure is employed by humans during visual motion perception.

Authors:  Johannes Bill; Hrag Pailian; Samuel J Gershman; Jan Drugowitsch
Journal:  Proc Natl Acad Sci U S A       Date:  2020-09-16       Impact factor: 11.205

9.  Automatic feature-based grouping during multiple object tracking.

Authors:  Gennady Erlikhman; Brian P Keane; Everett Mettler; Todd S Horowitz; Philip J Kellman
Journal:  J Exp Psychol Hum Percept Perform       Date:  2013-03-04       Impact factor: 3.332

10.  The role of visual working memory in attentive tracking of unique objects.

Authors:  Tal Makovski; Yuhong V Jiang
Journal:  J Exp Psychol Hum Percept Perform       Date:  2009-12       Impact factor: 3.332

