
Discriminating scene categories from brain activity within 100 milliseconds.

Matthew X Lowe, Jason Rajsic, Susanne Ferber, Dirk B Walther.

Abstract

Humans have the ability to make sense of the world around them in only a single glance. This astonishing feat requires the visual system to extract information from our environment with remarkable speed. How quickly does this process unfold across time, and what visual information contributes to our understanding of the visual world? We address these questions by directly measuring the temporal dynamics of the perception of colour photographs and line drawings of scenes with electroencephalography (EEG) during a scene-memorization task. Within a fraction of a second, event-related potentials (ERPs) show dissociable response patterns for global scene properties of content (natural versus manmade) and layout (open versus closed). Subsequent detailed analyses of within-category versus between-category discriminations found significant dissociations of basic-level scene categories (e.g., forest; city) within the first 100 msec of perception. The similarity of this neural activity with feature-based discriminations suggests low-level image statistics may be foundational for this rapid categorization. Interestingly, our results also suggest that the structure preserved in line drawings may form a primary and necessary basis for visual processing, whereas surface information may further enhance category selectivity in later-stage processing. Critically, these findings provide evidence that the distinction of both basic-level categories and global properties of scenes from neural signals occurs within 100 msec.
Copyright © 2018 Elsevier Ltd. All rights reserved.

Keywords:  EEG; Global; Line drawings; Perception; Time course; Visual cortex

Year:  2018        PMID: 30037637     DOI: 10.1016/j.cortex.2018.06.006

Source DB:  PubMed          Journal:  Cortex        ISSN: 0010-9452            Impact factor:   4.027


Related references: 7 in total

1.  Language Is a Unique Context for Emotion Perception.

Authors:  Cameron M Doyle; Maria Gendron; Kristen A Lindquist
Journal:  Affect Sci       Date:  2021-01-13

2.  Real-world structure facilitates the rapid emergence of scene category information in visual brain signals.

Authors:  Daniel Kaiser; Greta Häberle; Radoslaw M Cichy
Journal:  J Neurophysiol       Date:  2020-06-10       Impact factor: 2.714

3.  Reliability and Generalizability of Similarity-Based Fusion of MEG and fMRI Data in Human Ventral and Dorsal Visual Streams.

Authors:  Yalda Mohsenzadeh; Caitlin Mullin; Benjamin Lahner; Radoslaw Martin Cichy; Aude Oliva
Journal:  Vision (Basel)       Date:  2019-02-10

4.  A neural mechanism for contextualizing fragmented inputs during naturalistic vision.

Authors:  Daniel Kaiser; Jacopo Turini; Radoslaw M Cichy
Journal:  Elife       Date:  2019-10-09       Impact factor: 8.140

5.  Get Your Guidance Going: Investigating the Activation of Spatial Priors for Efficient Search in Virtual Reality.

Authors:  Julia Beitner; Jason Helbing; Dejan Draschkow; Melissa L-H Võ
Journal:  Brain Sci       Date:  2021-01-04

6.  Grounding Context in Embodied Cognitive Robotics.

Authors:  Diana Valenzo; Alejandra Ciria; Guido Schillaci; Bruno Lara
Journal:  Front Neurorobot       Date:  2022-06-15       Impact factor: 3.493

7.  Mapping the Scene and Object Processing Networks by Intracranial EEG.

Authors:  Kamil Vlcek; Iveta Fajnerova; Tereza Nekovarova; Lukas Hejtmanek; Radek Janca; Petr Jezdik; Adam Kalina; Martin Tomasek; Pavel Krsek; Jiri Hammer; Petr Marusic
Journal:  Front Hum Neurosci       Date:  2020-10-09       Impact factor: 3.169
