
Mobile computation: spatiotemporal integration of the properties of objects in motion.

Patrick Cavanagh, Alex O Holcombe, Weilun Chou.

Abstract

We demonstrate that, as an object moves, color and motion signals from successive, widely spaced locations are integrated, but letter and digit shapes are not. The features that integrate as an object moves match those that integrate when the eyes move but the object is stationary (spatiotopic integration). We suggest that this integration is mediated by large receptive fields gated by attention and that it occurs for surface features (motion and color), which can be summed without precise alignment, but not for shape features (letters or digits), which require such alignment. Rapidly alternating pairs of colors and motions were presented at several locations around a circle centered at fixation. The same two stimuli alternated at each location, with the phase of the alternation reversing from one location to the next. When observers attended to only one location, the stimuli alternated both in retinal coordinates and in the attended stream: feature identification was poor. When the observer's attention shifted around the circle in synchrony with the alternation, the stimuli still alternated at each location in retinal coordinates, but attention now always selected the same color and motion, and the stimulus appeared as a single unchanging object stepping across the locations. The maximum presentation rate at which the color and motion could be reported was twice that for stationary attention, suggesting (as control experiments confirmed) object-based integration of these features. In contrast, identification of a letter or digit alternating with a mask showed no advantage for moving attention, even though moving attention accessed (within the limits of precision for attentional selection) only the target and never the mask.
The masking apparently leaves partial information that cannot be integrated across locations, and we speculate that for spatially defined patterns like letters, integration across large shifts in location may be limited by problems in aligning successive samples. Our results also suggest that as attention moves, the selection of any given location (dwell time) can be as short as 50 ms, far shorter than the typical dwell time for stationary attention. Moving attention can therefore sample a brief instant of a rapidly changing stream if it passes quickly through, giving access to events that are otherwise not seen.
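As an aside, the phase-reversed alternation described in the abstract can be sketched in a few lines of code. This is a hypothetical illustration, not the authors' stimulus software: the location count, stimulus labels, and function names are assumptions chosen only to mirror the design.

```python
# Sketch of the stimulus logic: two stimuli alternate at every location
# around the circle, with the alternation phase reversed between
# neighboring locations. All names/values here are illustrative.
N_LOCATIONS = 8            # assumed number of locations on the circle
STIMULI = ["red", "green"]  # assumed stand-ins for the two alternating stimuli

def stimulus_at(location, step):
    """Stimulus shown at `location` on alternation `step`.
    Adding `location` to `step` reverses the phase from one
    location to the next."""
    return STIMULI[(step + location) % 2]

# Stationary attention at one location samples both stimuli in alternation...
stationary = [stimulus_at(0, t) for t in range(6)]

# ...while attention stepping one location per alternation always selects
# the same stimulus, as if tracking a single unchanging object.
moving = [stimulus_at(t % N_LOCATIONS, t) for t in range(6)]
```

Under this parameterization, the stationary sample alternates on every step, while the moving sample is constant, matching the report that synchronously moving attention sees "a single unchanging object stepping across the locations."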

Year: 2008        PMID: 18831615        PMCID: PMC2612738        DOI: 10.1167/8.12.1

Source DB: PubMed        Journal: J Vis        ISSN: 1534-7362        Impact factor: 2.240


References: 59 in total (first 10 shown)

1.  Role of synchrony in contour binding: some transient doubts sustained.

Authors:  Steven C Dakin; Peter J Bex
Journal:  J Opt Soc Am A Opt Image Sci Vis       Date:  2002-04       Impact factor: 2.129

2.  The shape and size of crowding for moving targets.

Authors:  Peter J Bex; Steven C Dakin; Anita J Simmers
Journal:  Vision Res       Date:  2003-12       Impact factor: 1.886

3.  Anterior inferotemporal neurons of monkeys engaged in object recognition can be highly sensitive to object retinal position.

Authors:  James J DiCarlo; John H R Maunsell
Journal:  J Neurophysiol       Date:  2003-06       Impact factor: 2.714

4.  The attentional dynamics of masked detection.

Authors:  Philip L Smith; Bradley J Wolfgang
Journal:  J Exp Psychol Hum Percept Perform       Date:  2004-02       Impact factor: 3.332

5.  Motion-based analysis of spatial patterns by the human visual system.

Authors:  Shin'ya Nishida
Journal:  Curr Biol       Date:  2004-05-25       Impact factor: 10.834

6.  Attentional modulation of visual processing. (Review)

Authors:  John H Reynolds; Leonardo Chelazzi
Journal:  Annu Rev Neurosci       Date:  2004       Impact factor: 12.449

7.  A new estimation of the duration of attentional dwell time.

Authors:  Jan Theeuwes; Richard Godijn; Jay Pratt
Journal:  Psychon Bull Rev       Date:  2004-02

8.  Perceptual-binding and persistent surface segregation.

Authors:  Farshad Moradi; Shinsuke Shimojo
Journal:  Vision Res       Date:  2004-11       Impact factor: 1.886

9.  The role of attention in central and peripheral motion integration.

Authors:  David Melcher; Sofia Crespi; Aurelio Bruno; M Concetta Morrone
Journal:  Vision Res       Date:  2004-06       Impact factor: 1.886

10.  Evidence for an interruption theory of backward masking.

Authors:  T J Spencer; R Shuntich
Journal:  J Exp Psychol       Date:  1970-08
Cited by: 14 in total (first 10 shown)

1.  Attention and non-retinotopic feature integration.

Authors:  Thomas U Otto; Haluk Öğmen; Michael H Herzog
Journal:  J Vis       Date:  2010-10-01       Impact factor: 2.240

2.  Automatic frame-centered object representation and integration revealed by iconic memory, visual priming, and backward masking.

Authors:  Zhicheng Lin; Sheng He
Journal:  J Vis       Date:  2012-10-25       Impact factor: 2.240

3.  Short temporal asynchrony disrupts visual object recognition.

Authors:  Jedediah M Singer; Gabriel Kreiman
Journal:  J Vis       Date:  2014-05-12       Impact factor: 2.240

4.  Barrier effects in non-retinotopic feature attribution.

Authors:  Murat Aydın; Michael H Herzog; Haluk Oğmen
Journal:  Vision Res       Date:  2011-07-08       Impact factor: 1.886

5.  Perceptual learning in a nonretinotopic frame of reference.

Authors:  Thomas U Otto; Haluk Oğmen; Michael H Herzog
Journal:  Psychol Sci       Date:  2010-06-28

6.  The Geometry of Visual Perception: Retinotopic and Non-retinotopic Representations in the Human Visual System.

Authors:  Haluk Oğmen; Michael H Herzog
Journal:  Proc IEEE Inst Electr Electron Eng       Date:  2010       Impact factor: 10.961

7.  Motion and tilt aftereffects occur largely in retinal, not in object, coordinates in the Ternus-Pikler display.

Authors:  Marco Boi; Haluk Oğmen; Michael H Herzog
Journal:  J Vis       Date:  2011-03-09       Impact factor: 2.240

8.  Feature integration across space, time, and orientation.

Authors:  Thomas U Otto; Haluk Ogmen; Michael H Herzog
Journal:  J Exp Psychol Hum Percept Perform       Date:  2009-12       Impact factor: 3.332

9.  A (fascinating) litmus test for human retino- vs. non-retinotopic processing.

Authors:  Marco Boi; Haluk Oğmen; Joseph Krummenacher; Thomas U Otto; Michael H Herzog
Journal:  J Vis       Date:  2009-12-05       Impact factor: 2.240

10.  Smooth pursuit eye movements improve temporal resolution for color perception.

Authors:  Masahiko Terao; Junji Watanabe; Akihiro Yagi; Shin'ya Nishida
Journal:  PLoS One       Date:  2010-06-21       Impact factor: 3.240

