
Real-time fMRI data for testing OpenNFT functionality.

Yury Koush, John Ashburner, Evgeny Prilepin, Ronald Sladky, Peter Zeidman, Sergei Bibikov, Frank Scharnowski, Artem Nikonorov, Dimitri Van De Ville.

Abstract

Here, we briefly describe the real-time fMRI data that is provided for testing the functionality of the open-source Python/Matlab framework for neurofeedback, termed Open NeuroFeedback Training (OpenNFT, Koush et al. [1]). The data set contains real-time fMRI runs from three anonymized participants (i.e., one neurofeedback run per participant), their structural scans, and pre-selected ROIs/masks/weights. The data allows for simulating the neurofeedback experiment without an MR scanner, exploring the software functionality, and measuring data processing times on the local hardware. In accordance with the descriptions in our main article, we provide data of (1) periodically displayed (intermittent) activation-based feedback; (2) intermittent effective connectivity feedback, based on dynamic causal modeling (DCM) estimations; and (3) continuous classification-based feedback, derived from support-vector-machine (SVM) estimations. The data is available on our public GitHub repository: https://github.com/OpenNFT/OpenNFT_Demo/releases.


Keywords:  Activity; Connectivity; Multivariate pattern analysis; Neurofeedback; OpenNFT; Real-time fMRI

Year:  2017        PMID: 28795112      PMCID: PMC5547236          DOI: 10.1016/j.dib.2017.07.049

Source DB:  PubMed          Journal:  Data Brief        ISSN: 2352-3409


Value of the data:
- The data allows for testing the software functionality of OpenNFT and other neurofeedback software.
- The data allows for assessing the timing of (pre)processing steps for different feedback estimation schemes.
- The data can be used for testing one's own neurofeedback setup.

Data

The three real-time fMRI data runs were acquired using (1) intermittent activation-based feedback; (2) intermittent effective connectivity feedback; and (3) continuous classification-based feedback. The interested reader can download the anonymized experimental data and re-run it using OpenNFT [1]. All participants gave written informed consent to participate in the experiment, which was approved by the local ethics committee. In addition to the data, we also provide files containing the OpenNFT settings, experimental protocol and experimental design modelled in SPM (http://www.fil.ion.ucl.ac.uk/spm).

Experimental design, materials and methods

Case study 1: intermittent activation-based feedback

The participant performed one fMRI localizer run to delineate the bilateral primary visual cortices and a subsequent neurofeedback run to learn control over these ROIs. The localizer run consisted of eight 20 s baseline blocks that were interleaved with seven 20 s regulation blocks. During the regulation blocks, the participant was asked to perform visual-spatial imagery at a location indicated by a circle presented at the center of the visual field. During the baseline blocks, the participant was asked to look at the center of the screen. The neurofeedback run consisted of eight 16 s baseline blocks that were interleaved with seven 20 s regulation blocks, each followed by a 4 s neurofeedback display block. The activation-based feedback signal consisted of the scaled average activity level of the two visual ROIs. The experiment was performed at the Brain and Behavior Laboratory (University of Geneva) on a 3T MR scanner (Trio Tim, Siemens Medical Solutions, Germany). Functional images were acquired with a single-shot gradient-echo T2*-weighted EPI sequence with 150 scans (32-channel receive head coil, TR=1970 ms, volume size=74×74×36 voxels, isotropic 3 mm voxel size, flip angle α=75°, bw=1572 Hz/pixel, TE=30 ms). The first five EPI volumes were discarded to account for T1 saturation effects.
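The feedback scheme above (scaled average activity of the two visual ROIs) can be sketched as follows. This is an illustrative reconstruction, not OpenNFT's actual implementation: the function name, the percent-signal-change convention, and the scaling factor are all assumptions.

```python
import numpy as np

def activation_feedback(roi_left, roi_right, baseline_mean, scale=100.0):
    """Illustrative sketch (not OpenNFT's actual code): feedback as the
    scaled average activity of two ROIs, expressed here as percent
    signal change relative to a baseline estimate (an assumption)."""
    current = 0.5 * (roi_left.mean() + roi_right.mean())  # average over both ROIs
    return scale * (current - baseline_mean) / baseline_mean
```

In the actual experiment the ROI signals would come from the incrementally preprocessed EPI volumes; here the inputs are plain NumPy arrays of voxel intensities.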

Case study 2: intermittent effective connectivity feedback

The participant performed one neurofeedback run that consisted of seven trials. Each neurofeedback trial was composed of four 12 s regulation blocks interleaved with five baseline blocks of the same duration. During baseline blocks, images of neutral objects were presented and the participant was asked to passively look at them. During regulation blocks, moderately positive social images were presented and the participant was asked to experience the depicted positive social situation. The change between the conditions was indicated with the words ‘POS’ and ‘NEUT’. The feedback signal was based on a comparison of how well two alternative DCM models fitted the data acquired during a trial. The dominance of the target over the alternative model was computed as the difference between their log model evidences, i.e., the log of the Bayes factor (for more details, see [2], [3]). The target model represented top-down modulation from the dorsomedial prefrontal cortex (dmPFC) onto the bilateral amygdala, and the alternative model represented the bottom-up flow of information from the bilateral amygdala onto the dmPFC. The amygdala and dmPFC ROIs were updated adaptively after each trial using an incremental GLM. The experiment was performed at the Brain and Behavior Laboratory (University of Geneva) on a 3T MR scanner (Trio Tim, Siemens Medical Solutions, Germany). Functional images were acquired with a single-shot gradient-echo T2*-weighted EPI sequence with 1050 scans (32-channel receive head coil, TR=1100 ms, volume size=120×120×18 voxels, isotropic 1.8 mm voxel size, flip angle α=70°, bw=1540 Hz/pixel, TE=30 ms). The first 10 EPI volumes were discarded to account for T1 saturation effects.
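The feedback value in this scheme reduces to the log Bayes factor between the two DCMs. A minimal sketch of that arithmetic follows; the function names are hypothetical, and the log model evidences would in practice come from the real-time DCM estimation, which is not reproduced here.

```python
import math

def log_bayes_factor(log_evidence_target, log_evidence_alt):
    """Difference of log model evidences: positive values indicate that
    the target (top-down) model fits the trial data better than the
    alternative (bottom-up) model."""
    return log_evidence_target - log_evidence_alt

def bayes_factor(log_bf):
    """The corresponding Bayes factor, if a ratio scale is preferred."""
    return math.exp(log_bf)
```

For example, log evidences of -100 (target) and -103 (alternative) give a log Bayes factor of 3 in favor of the target model.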

Case study 3: continuous classification-based feedback

The participant first performed two fMRI runs to provide data to train an SVM classifier using the PRoNTo toolbox [4]. Each of these fMRI runs consisted of seven 20 s regulation blocks that were interleaved with seven 20 s baseline blocks. The participant was asked to perform visual-spatial imagery during the regulation blocks, and to look at the center of the screen during the baseline blocks. Next, the participant performed a real-time fMRI run with a similar design (10 regulation and 11 baseline blocks), where feedback was provided as an expanding circle placed at the center of the screen. The feedback signal was computed as the dot product between the pre-trained classifier weight vector and the current data vector extracted from the classification mask [5], [6]. The experiment was performed at Campus Biotech (Geneva) on a 3T MR scanner (Prisma, Siemens Medical Solutions, Germany). Functional images were acquired with a single-shot gradient-echo T2*-weighted EPI sequence with 210 scans (32-channel receive head coil, TR=2020 ms, volume size=100×100×35 voxels, isotropic 2.2 mm voxel size, flip angle α=74°, bw=1565 Hz/pixel, TE=28 ms). The first five EPI volumes were discarded to account for T1 saturation effects. For more details about the applied analyses, see the associated research article [1].
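The continuous classification feedback described above is a dot product between the pre-trained weight vector and the masked data vector of the current volume. A minimal sketch under that description follows; the function name and the bias-free linear form are assumptions, and PRoNTo's actual interface differs.

```python
import numpy as np

def svm_feedback(weights, volume, mask):
    """Illustrative sketch: linear SVM decision value for one EPI volume.
    weights: 1-D weight vector from the pre-trained classifier.
    volume:  3-D array of voxel intensities for the current scan.
    mask:    3-D boolean classification mask selecting the voxels."""
    x = volume[mask]                  # vectorize voxels inside the mask
    return float(np.dot(weights, x))  # signed value drives the feedback display
```

In the experiment, this value would be rescaled to set the radius of the expanding feedback circle.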
Specifications Table
Subject area: Neurosciences
More specific subject area: Neuroimaging, real-time fMRI, neurofeedback
Type of data: Data repository
How data was acquired: Siemens 3T MR scanners (Trio and Prisma)
Data format: Raw, anonymized DICOMs and NIFTIs
Experimental factors: Approved by the local ethics committee
Experimental features: Real-time functional MRI
Data source location: Geneva, Switzerland
Data accessibility: The data is available on the public GitHub repository: https://github.com/OpenNFT/OpenNFT_Demo/releases
References (6 in total)

1.  Decoding fMRI brain states in real-time.

Authors:  Stephen M LaConte
Journal:  Neuroimage       Date:  2010-06-30       Impact factor: 6.556

2.  Learning Control Over Emotion Networks Through Connectivity-Based Neurofeedback.

Authors:  Yury Koush; Djalel-E Meskaldji; Swann Pichon; Gwladys Rey; Sebastian W Rieger; David E J Linden; Dimitri Van De Ville; Patrik Vuilleumier; Frank Scharnowski
Journal:  Cereb Cortex       Date:  2017-02-01       Impact factor: 5.357

3.  Real-time fMRI using brain-state classification.

Authors:  Stephen M LaConte; Scott J Peltier; Xiaoping P Hu
Journal:  Hum Brain Mapp       Date:  2007-10       Impact factor: 5.038

4.  OpenNFT: An open-source Python/Matlab framework for real-time fMRI neurofeedback training based on activity, connectivity and multivariate pattern analysis.

Authors:  Yury Koush; John Ashburner; Evgeny Prilepin; Ronald Sladky; Peter Zeidman; Sergei Bibikov; Frank Scharnowski; Artem Nikonorov; Dimitri Van De Ville
Journal:  Neuroimage       Date:  2017-06-21       Impact factor: 6.556

5.  PRoNTo: pattern recognition for neuroimaging toolbox.

Authors:  J Schrouff; M J Rosa; J M Rondina; A F Marquand; C Chu; J Ashburner; C Phillips; J Richiardi; J Mourão-Miranda
Journal:  Neuroinformatics       Date:  2013-07

6.  Connectivity-based neurofeedback: dynamic causal modeling for real-time fMRI.

Authors:  Yury Koush; Maria Joao Rosa; Fabien Robineau; Klaartje Heinen; Sebastian W Rieger; Nikolaus Weiskopf; Patrik Vuilleumier; Dimitri Van De Ville; Frank Scharnowski
Journal:  Neuroimage       Date:  2013-05-11       Impact factor: 6.556

