
Integration and transmission of distributed deterministic neural activity in feed-forward networks.

Yoshiyuki Asai, Alessandro E P Villa.

Abstract

A ten-layer feed-forward network, characterized by diverging/converging patterns of projection between successive layers of regular spiking (RS) neurons, is activated by an external spatiotemporal input pattern fed to Layer 1 in the presence of stochastic background activity fed to all layers. We used three dynamical systems to derive the external input spike trains carrying the temporal information, and three types of neuron model for the network: a network formed either by exponential integrate-and-fire neurons (RS-EIF, Fourcaud-Trocmé et al., 2003), by simple spiking neurons (RS-IZH, Izhikevich, 2004), or by multiple-timescale adaptive threshold neurons (RS-MAT, Kobayashi et al., 2009), each tested at five intensities of background activity. The temporal structure embedded in the output spike trains was assessed by detecting the preferred firing sequences used to reconstruct de-noised spike trains (Asai and Villa, 2008). We confirmed that the RS-MAT model is likely the most efficient of the three at integrating and transmitting the temporal structure embedded in the external input. We observed that this structure not only propagated up to the 10th layer but, in some cases, was better retained beyond the 4th downstream layer. This study suggests that diverging/converging network structures could, through the propagation of synfire activity, play a key role in the transmission of complex temporal patterns of discharge associated with deterministic nonlinear activity. This article is part of a Special Issue entitled Neural Coding.
Copyright © 2011 Elsevier B.V. All rights reserved.
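Of the three neuron models named in the abstract, the Izhikevich "simple spiking" model (RS-IZH) has the most compact published update rule, so it serves well as an illustration. The sketch below simulates a single regular-spiking (RS) Izhikevich neuron with the standard published parameters (a=0.02, b=0.2, c=-65, d=8); the constant input current, time step, and function name are illustrative assumptions and are not taken from the paper, which embeds such neurons in a multi-layer network with structured and stochastic inputs.

```python
# Minimal sketch of one regular-spiking (RS) Izhikevich neuron
# (Izhikevich, 2003/2004 "simple model"), forward-Euler integration.
# The input current I and step dt are illustrative choices, not the
# paper's simulation protocol.

def izhikevich_rs(I=10.0, T=1000.0, dt=0.5):
    """Simulate one RS neuron for T ms; return the spike times in ms."""
    a, b, c, d = 0.02, 0.2, -65.0, 8.0   # standard RS parameter set
    v, u = c, b * c                      # membrane potential, recovery variable
    spikes = []
    t = 0.0
    while t < T:
        # dv/dt = 0.04 v^2 + 5 v + 140 - u + I ;  du/dt = a (b v - u)
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += dt * a * (b * v - u)
        if v >= 30.0:                    # spike cutoff reached
            spikes.append(t)
            v, u = c, u + d              # after-spike reset
        t += dt
    return spikes

spike_times = izhikevich_rs()
```

With a sustained suprathreshold current, the RS parameter set produces the tonic, adapting discharge pattern that the model is known for; the adaptation comes from the recovery variable `u` being incremented by `d` at each spike.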


Year:  2011        PMID: 22071564     DOI: 10.1016/j.brainres.2011.10.012

Source DB:  PubMed          Journal:  Brain Res        ISSN: 0006-8993            Impact factor:   3.252


Related articles:  3 in total

1.  Synchronization-based computation through networks of coupled oscillators.

Authors:  Daniel Malagarriga; Mariano A García-Vellisca; Alessandro E P Villa; Javier M Buldú; Jordi García-Ojalvo; Antonio J Pons
Journal:  Front Comput Neurosci       Date:  2015-08-04       Impact factor: 2.380

2.  An attractor-based complexity measurement for Boolean recurrent neural networks.

Authors:  Jérémie Cabessa; Alessandro E P Villa
Journal:  PLoS One       Date:  2014-04-11       Impact factor: 3.240

3.  Pre-Synaptic Pool Modification (PSPM): A supervised learning procedure for recurrent spiking neural networks.

Authors:  Bryce Allen Bagley; Blake Bordelon; Benjamin Moseley; Ralf Wessel
Journal:  PLoS One       Date:  2020-02-24       Impact factor: 3.240

