
Time limits during visual foraging reveal flexible working memory templates.

Tómas Kristjánsson, Ian M Thornton, Árni Kristjánsson

Abstract

During difficult foraging tasks, humans rarely switch between target categories, but they switch frequently during easier foraging. Does this reflect fundamental limits on visual working memory (VWM) capacity, or simply a strategic choice driven by effort? Our participants performed time-limited or unlimited foraging tasks in which they tapped stimuli from 2 target categories while avoiding items from 2 distractor categories. These time limits should have no effect if capacity imposes limits on VWM representations, but a more flexible VWM could allow observers to use it according to the task demands in each case. We found that with time limits, participants switched more frequently and switch costs became much smaller than during unlimited foraging. Observers can therefore switch between complex (conjunction) target categories when needed. We propose that while maintaining many complex templates in working memory is effortful and observers avoid doing so, they can do so when it fits task demands, showing the flexibility of the working memory representations used for visual exploration. This contrasts with recent proposals, and we discuss the implications of these findings for theoretical accounts of working memory. (PsycINFO Database Record (c) 2018 APA, all rights reserved.)

Year:  2018        PMID: 29809049     DOI: 10.1037/xhp0000517

Source DB:  PubMed          Journal:  J Exp Psychol Hum Percept Perform        ISSN: 0096-1523            Impact factor:   3.332


  10 in total

1.  Beta and Theta Oscillations Differentially Support Free Versus Forced Control over Multiple-Target Search.

Authors:  Joram van Driel; Eduard Ort; Johannes J Fahrenfort; Christian N L Olivers
Journal:  J Neurosci       Date:  2019-01-07       Impact factor: 6.167

2.  The development of foraging organization.

Authors:  Inga María Ólafsdóttir; Steinunn Gestsdóttir; Árni Kristjánsson
Journal:  Atten Percept Psychophys       Date:  2021-06-08       Impact factor: 2.199

3.  Forty years after feature integration theory: An introduction to the special issue in honor of the contributions of Anne Treisman.

Authors:  Jeremy M Wolfe
Journal:  Atten Percept Psychophys       Date:  2020-01       Impact factor: 2.199

4.  [Review] Foraging behavior in visual search: A review of theoretical and mathematical models in humans and animals.

Authors:  Marcos Bella-Fernández; Manuel Suero Suñé; Beatriz Gil-Gómez de Liaño
Journal:  Psychol Res       Date:  2021-03-21

5.  The Predation Game: Does dividing attention affect patterns of human foraging?

Authors:  Ian M Thornton; Jérôme Tagu; Sunčica Zdravković; Árni Kristjánsson
Journal:  Cogn Res Princ Implic       Date:  2021-05-06

6.  [Review] Guided Search 6.0: An updated model of visual search.

Authors:  Jeremy M Wolfe
Journal:  Psychon Bull Rev       Date:  2021-02-05

7.  Can you have multiple attentional templates? Large-scale replications of Van Moorselaar, Theeuwes, and Olivers (2014) and Hollingworth and Beck (2016).

Authors:  Marcella Frătescu; Dirk Van Moorselaar; Sebastiaan Mathôt
Journal:  Atten Percept Psychophys       Date:  2019-11       Impact factor: 2.199

8.  Keeping it real: Looking beyond capacity limits in visual cognition.

Authors:  Árni Kristjánsson; Dejan Draschkow
Journal:  Atten Percept Psychophys       Date:  2021-03-31       Impact factor: 2.199

9.  Hiding the Rabbit: Using a genetic algorithm to investigate shape guidance in visual search.

Authors:  Avi M Aizenman; Krista A Ehinger; Farahnaz A Wick; Ruggero Micheletto; Jungyeon Park; Lucas Jurgensen; Jeremy M Wolfe
Journal:  J Vis       Date:  2022-01-04       Impact factor: 2.240

10.  Concurrent guidance of attention by multiple working memory items: Behavioral and computational evidence.

Authors:  Cherie Zhou; Monicque M Lorist; Sebastiaan Mathôt
Journal:  Atten Percept Psychophys       Date:  2020-08       Impact factor: 2.199
