
Randomly connected networks have short temporal memory.

Edward Wallace, Hamid Reza Maei, Peter E Latham.

Abstract

The brain is easily able to process and categorize complex time-varying signals. For example, the two sentences, "It is cold in London this time of year" and "It is hot in London this time of year," have different meanings, even though the words hot and cold appear several seconds before the ends of the two sentences. Any network that can tell these sentences apart must therefore have a long temporal memory. In other words, the current state of the network must depend on events that happened several seconds ago. This is a difficult task, as neurons are dominated by relatively short time constants--tens to hundreds of milliseconds. Nevertheless, it was recently proposed that randomly connected networks could exhibit the long memories necessary for complex temporal processing. This is an attractive idea, both for its simplicity and because little tuning of recurrent synaptic weights is required. However, we show that when connectivity is high, as it is in the mammalian brain, randomly connected networks cannot exhibit temporal memory much longer than the time constants of their constituent neurons.
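The abstract's central claim, that a densely, randomly connected network forgets on roughly the timescale of its single-neuron dynamics, can be illustrated with a minimal simulation. The sketch below is not from the paper; the network size, spectral radius, and discrete-time tanh update are illustrative choices. A pulse encoded in the network state decays roughly geometrically, so its trace survives only a handful of neural time constants.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                                    # network size (illustrative)
W = rng.normal(0, 1 / np.sqrt(N), (N, N))  # dense random connectivity
rho = max(abs(np.linalg.eigvals(W)))
W *= 0.9 / rho                             # rescale to spectral radius 0.9 (stable regime)

x = rng.normal(0, 1, N)                    # "pulse": state encodes a past event
norms = []
for t in range(100):
    norms.append(np.linalg.norm(x))
    x = np.tanh(W @ x)                     # one update ~ one neural time constant

# The pulse's trace shrinks roughly like 0.9**t once transients die out,
# so after ~100 steps essentially nothing of the event remains.
print(norms[0], norms[-1])
```

With a spectral radius below one, the decay rate is set by the leading eigenvalue of the connectivity matrix, so the effective memory is only a few multiples of the neuronal time constant, consistent with the paper's conclusion for highly connected networks.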


Year:  2013        PMID: 23517097     DOI: 10.1162/NECO_a_00449

Source DB:  PubMed          Journal:  Neural Comput        ISSN: 0899-7667            Impact factor:   2.026


Related articles:  8 in total

1.  Two types of asynchronous activity in networks of excitatory and inhibitory spiking neurons.

Authors:  Srdjan Ostojic
Journal:  Nat Neurosci       Date:  2014-02-23       Impact factor: 24.884

2.  Useful dynamic regimes emerge in recurrent networks.

Authors:  Vishwa Goudar; Dean V Buonomano
Journal:  Nat Neurosci       Date:  2014-04       Impact factor: 24.884

3.  Encoding sensory and motor patterns as time-invariant trajectories in recurrent neural networks.

Authors:  Vishwa Goudar; Dean V Buonomano
Journal:  Elife       Date:  2018-03-14       Impact factor: 8.140

4.  Dynamics and Information Import in Recurrent Neural Networks.

Authors:  Claus Metzner; Patrick Krauss
Journal:  Front Comput Neurosci       Date:  2022-04-27       Impact factor: 3.387

5.  Learning Universal Computations with Spikes.

Authors:  Dominik Thalmeier; Marvin Uhlmann; Hilbert J Kappen; Raoul-Martin Memmesheimer
Journal:  PLoS Comput Biol       Date:  2016-06-16       Impact factor: 4.475

6.  At the Edge of Chaos: How Cerebellar Granular Layer Network Dynamics Can Provide the Basis for Temporal Filters.

Authors:  Christian Rössert; Paul Dean; John Porrill
Journal:  PLoS Comput Biol       Date:  2015-10-20       Impact factor: 4.475

7.  Scale free topology as an effective feedback system.

Authors:  Alexander Rivkind; Hallel Schreier; Naama Brenner; Omri Barak
Journal:  PLoS Comput Biol       Date:  2020-05-11       Impact factor: 4.475

8.  The Dynamics of Balanced Spiking Neuronal Networks Under Poisson Drive Is Not Chaotic.

Authors:  Qing-Long L Gu; Zhong-Qi K Tian; Gregor Kovačič; Douglas Zhou; David Cai
Journal:  Front Comput Neurosci       Date:  2018-06-28       Impact factor: 2.380

