Oscillatory signatures of crossmodal congruence effects: An EEG investigation employing a visuotactile pattern matching paradigm.

Florian Göschl, Uwe Friese, Jonathan Daume, Peter König, Andreas K Engel.

Abstract

Coherent percepts emerge from the accurate combination of inputs from the different sensory systems. There is an ongoing debate about the neurophysiological mechanisms of crossmodal interactions in the brain, and it has been proposed that transient synchronization of neurons might be of central importance. Oscillatory activity in lower frequency ranges (<30 Hz) has been implicated in mediating long-range communication as typically studied in multisensory research. In the current study, we recorded high-density electroencephalograms while human participants were engaged in a visuotactile pattern matching paradigm and analyzed oscillatory power in the theta- (4-7 Hz), alpha- (8-13 Hz) and beta-bands (13-30 Hz). Employing the same physical stimuli, separate tasks of the experiment either required the detection of predefined targets in visual and tactile modalities or the explicit evaluation of crossmodal stimulus congruence. Analysis of the behavioral data showed benefits for congruent visuotactile stimulus combinations. Differences in oscillatory dynamics related to crossmodal congruence within the two tasks were observed in the beta-band for crossmodal target detection, as well as in the theta-band for congruence evaluation. Contrasting ongoing activity preceding visuotactile stimulation between the two tasks revealed differences in the alpha- and beta-bands. Source reconstruction of between-task differences showed prominent involvement of premotor cortex, supplementary motor area, somatosensory association cortex and the supramarginal gyrus. These areas not only exhibited more involvement in the pre-stimulus interval for target detection compared to congruence evaluation, but were also crucially involved in post-stimulus differences related to crossmodal stimulus congruence within the detection task.
These results add to the increasing evidence that low frequency oscillations are functionally relevant for integration in distributed brain networks, as demonstrated for crossmodal interactions in visuotactile pattern matching in the current study.
Copyright © 2015 Elsevier Inc. All rights reserved.

Keywords:  Cortical oscillations; Crossmodal congruence; Multisensory integration; Pattern matching; Visuotactile

Year:  2015        PMID: 25846580     DOI: 10.1016/j.neuroimage.2015.03.067

Source DB:  PubMed          Journal:  Neuroimage        ISSN: 1053-8119            Impact factor:   6.556


Related articles: 11 in total

1.  Reduced frontal white matter microstructure in healthy older adults with low tactile recognition performance.

Authors:  Focko L Higgen; Hanna Braaß; Winifried Backhaus; Robert Schulz; Gui Xue; Christian Gerloff
Journal:  Sci Rep       Date:  2021-06-03       Impact factor: 4.379

2.  Oscillatory activity in auditory cortex reflects the perceptual level of audio-tactile integration.

Authors:  Michael Plöchl; Jeremy Gaston; Tim Mermagen; Peter König; W David Hairston
Journal:  Sci Rep       Date:  2016-09-20       Impact factor: 4.379

3.  Attentional Resource Allocation in Visuotactile Processing Depends on the Task, But Optimal Visuotactile Integration Does Not Depend on Attentional Resources.

Authors:  Basil Wahn; Peter König
Journal:  Front Integr Neurosci       Date:  2016-03-08

4.  Oscillatory brain activity during multisensory attention reflects activation, disinhibition, and cognitive control.

Authors:  Uwe Friese; Jonathan Daume; Florian Göschl; Peter König; Peng Wang; Andreas K Engel
Journal:  Sci Rep       Date:  2016-09-08       Impact factor: 4.379

5.  Frontal and parietal alpha oscillations reflect attentional modulation of cross-modal matching.

Authors:  Jonas Misselhorn; Uwe Friese; Andreas K Engel
Journal:  Sci Rep       Date:  2019-03-22       Impact factor: 4.379

6.  Multi-spectral oscillatory dynamics serving directed and divided attention.

Authors:  Marie C McCusker; Alex I Wiesman; Mikki D Schantell; Jacob A Eastman; Tony W Wilson
Journal:  Neuroimage       Date:  2020-05-11       Impact factor: 6.556

7.  New insights on the ventral attention network: Active suppression and involuntary recruitment during a bimodal task.

Authors:  Rodolfo Solís-Vivanco; Ole Jensen; Mathilde Bonnefond
Journal:  Hum Brain Mapp       Date:  2020-12-21       Impact factor: 5.038

8.  Crossmodal Pattern Discrimination in Humans and Robots: A Visuo-Tactile Case Study.

Authors:  Focko L Higgen; Philipp Ruppel; Michael Görner; Matthias Kerzel; Norman Hendrich; Jan Feldheim; Stefan Wermter; Jianwei Zhang; Christian Gerloff
Journal:  Front Robot AI       Date:  2020-12-23

9.  Sensory capability and information integration independently explain the cognitive status of healthy older adults.

Authors:  Jonas Misselhorn; Florian Göschl; Focko L Higgen; Friedhelm C Hummel; Christian Gerloff; Andreas K Engel
Journal:  Sci Rep       Date:  2020-12-31       Impact factor: 4.379

10.  Audition and vision share spatial attentional resources, yet attentional load does not disrupt audiovisual integration.

Authors:  Basil Wahn; Peter König
Journal:  Front Psychol       Date:  2015-07-29
