Literature DB >> 34993892

Depth from blur and grouping under inattention.

Einat Rashal, Johan Wagemans.

Abstract

Previous studies have provided evidence that attention operates in three-dimensional space, and that organizational processes are iterative and multistage in their relation to attention and depth. We investigated depth perception and attentional demands in grouping organizations that contain blur as a depth cue. Unlike in previous studies, our displays implied no depth from occlusion via a shared border between groups or surfaces. To evaluate depth perception, we collected subjective reports in which participants indicated which elements, blurry or sharp, they perceived as closer. To examine whether depth perception from blur can alleviate attentional demands, we used an inattention paradigm: we presented displays of grouping organizations by collinearity or color similarity that were previously found to require attention, and added blur to the figure or the background elements to generate depth perception. In addition, we presented similar displays containing grouping by blur similarity as a single cue. We hypothesized that adding blur would facilitate the segmentation of element groups due to their perceived depth, which might diminish the demand for attention. Our results confirmed that blur led to depth perception, and that sharp elements were perceived as closer more frequently than blurry elements. These results thus provide novel evidence for depth from blur in grouping where no inference of occlusion can be derived from a border. However, although the results suggest that blur information was processed under inattention, we found little evidence that blur decreased the attentional demands of grouping processes.
© 2022. The Psychonomic Society, Inc.

Keywords:  3D perception; Attention; Depth and shape from X; Divided Attention and Inattention; Grouping and Segmentation

Year:  2022        PMID: 34993892     DOI: 10.3758/s13414-021-02402-1

Source DB:  PubMed          Journal:  Atten Percept Psychophys        ISSN: 1943-3921            Impact factor:   2.199


References:  17 in total

Review 1.  Segmentation, attention and phenomenal visual objects.

Authors:  J Driver; G Davis; C Russell; M Turatto; E Freeman
Journal:  Cognition       Date:  2001-06

Review 2.  When does grouping happen?

Authors:  Stephen E Palmer; Joseph L Brooks; Rolf Nelson
Journal:  Acta Psychol (Amst)       Date:  2003-11

3.  Sharpness overconstancy in peripheral vision.

Authors:  S J Galvin; R P O'Shea; A M Squire; D G Govan
Journal:  Vision Res       Date:  1997-08       Impact factor: 1.886

4.  The whole is equal to the sum of its parts: a probabilistic model of grouping by proximity and similarity in regular patterns.

Authors:  Michael Kubovy; Martin van den Berg
Journal:  Psychol Rev       Date:  2008-01       Impact factor: 8.934

5.  Figure-ground segmentation can occur without attention.

Authors:  Ruth Kimchi; Mary A Peterson
Journal:  Psychol Sci       Date:  2008-07

6.  High-speed switchable lens enables the development of a volumetric stereoscopic display.

Authors:  Gordon D Love; David M Hoffman; Philip J W Hands; James Gao; Andrew K Kirby; Martin S Banks
Journal:  Opt Express       Date:  2009-08-31       Impact factor: 3.894

7.  Are blur and disparity complementary cues to depth?

Authors:  Michael S Langer; Ryan A Siciliano
Journal:  Vision Res       Date:  2014-12-04       Impact factor: 1.886

8.  Textons, the elements of texture perception, and their interactions.

Authors:  B Julesz
Journal:  Nature       Date:  1981-03-12       Impact factor: 49.962

9.  Blur and disparity are complementary cues to depth.

Authors:  Robert T Held; Emily A Cooper; Martin S Banks
Journal:  Curr Biol       Date:  2012-02-09       Impact factor: 10.834

10.  Edge-region grouping in figure-ground organization and depth perception.

Authors:  Stephen E Palmer; Joseph L Brooks
Journal:  J Exp Psychol Hum Percept Perform       Date:  2008-12       Impact factor: 3.332

