Literature DB >> 25873857

Event-driven contrastive divergence: neural sampling foundations.

Emre Neftci, Srinjoy Das, Bruno Pedroni, Kenneth Kreutz-Delgado, Gert Cauwenberghs.

Abstract

Keywords:  Markov chain Monte Carlo; neural sampling; probabilistic inference; spiking neurons; synaptic plasticity

Year:  2015        PMID: 25873857      PMCID: PMC4379871          DOI: 10.3389/fnins.2015.00104

Source DB:  PubMed          Journal:  Front Neurosci        ISSN: 1662-453X            Impact factor:   4.677


In a recent Frontiers in Neuroscience paper (Neftci et al., 2014) we contributed an on-line learning rule, driven by spike events in an integrate-and-fire (IF) neural network, that emulates the learning performance of Contrastive Divergence (CD) in an equivalent Restricted Boltzmann Machine (RBM) and is amenable to real-time implementation in spike-based neuromorphic systems. The event-driven CD framework builds on the foundations of neural sampling (Buesing et al., 2011; Maass, 2014) in mapping spike rates of a deterministic IF network onto probabilities of a corresponding stochastic neural network. In Neftci et al. (2014), we used a particular form of neural sampling previously analyzed in Petrovici et al. (2013), although this connection was not made sufficiently clear in the published article. The purpose of this letter is to clarify that connection, and to raise the reader's awareness of the existence of various forms of neural sampling. We highlight the differences as well as the strong connections across these forms, and suggest applications of event-driven CD in a more general setting enabled by broader interpretations of neural sampling.

In the Bayesian view of neural information processing, the cognitive function of the brain arises from its ability to encode and combine probabilities describing its interactions with an uncertain world (Doya et al., 2007). A recent neural sampling hypothesis has shed light on how probabilities may be encoded in neural circuits (Fiser et al., 2010; Berkes et al., 2011): spikes are viewed as samples of a target probability distribution. From a modeling perspective, a key advantage of this view is that learning in spiking neural networks becomes more tractable than in the alternative view, in which neurons encode probabilities, because one can borrow from well-established algorithms in machine learning (Fiser et al., 2010; see Nessler et al., 2013 for a concrete example).

Merolla et al. (2010) demonstrated a Boltzmann machine using IF neurons. In this model, spiking neurons integrate Poisson-distributed spikes during a fixed time window set by a global rhythmic oscillation. A first-passage-time analysis shows that the probability that a neuron spikes in the given time window follows a logistic sigmoid function consistent with a Boltzmann distribution. The particular form of rhythmic oscillation ensures that, even when neurons are recurrently coupled, the network produces a sample of a Boltzmann distribution at each oscillation cycle. Merolla et al. (2010) also suggest alternative, more biologically plausible forms of learning induced by rhythmic oscillations that resemble the role of theta oscillations across large neuronal ensembles. Our event-driven CD rule is compatible with Merolla et al.'s sampler because it would simply update the weights at every cycle of the rhythmic oscillation.

Shortly after, Buesing et al. (2011) proved that abstract neuron models consistent with the behavior of biological spiking neurons (Jolivet et al., 2006) can perform Markov chain Monte Carlo (MCMC) sampling of a Boltzmann distribution. Their sampler does not require global oscillations, although these could enable sampling from multiple distributions within the same network (Habenschuss et al., 2013). To demonstrate the performance of the sampler, a Boltzmann machine was trained off-line using CD. Learning in this model was further extended to on-line updates in a precursor of event-driven CD (Pedroni et al., 2013). An open question was whether neuron models that describe the biological processes of nerve cells, endowed with deterministic action potential generation mechanisms, can support stochastic sampling as described with the more abstract spiking models of Buesing et al. (2011).
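The common thread of these samplers is that each unit turns on with a logistic sigmoid probability of its net input, so that the stationary distribution of network states is a Boltzmann distribution. A minimal sketch of this idea, using plain random-scan Gibbs sampling rather than any of the specific spiking dynamics above (all weights, biases, and the two-unit network are illustrative choices, not taken from the paper):

```python
import math
import random

def gibbs_sample_boltzmann(W, b, n_steps, seed=0):
    """Sample binary states z from P(z) proportional to
    exp(sum_k b[k]*z[k] + sum_{k<j} W[k][j]*z[k]*z[j]),
    with W symmetric and zero diagonal, by random-scan Gibbs updates:
    each visited unit turns on with probability
    sigmoid(b[k] + sum_j W[k][j]*z[j])."""
    rng = random.Random(seed)
    n = len(b)
    z = [0] * n
    samples = []
    for _ in range(n_steps):
        k = rng.randrange(n)                      # pick a unit at random
        u = b[k] + sum(W[k][j] * z[j] for j in range(n) if j != k)
        z[k] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-u)) else 0
        samples.append(tuple(z))                  # one network state per update
    return samples
```

For a two-unit network with coupling W = [[0, 1], [1, 0]] and biases b = [-0.5, -0.5], the exact Boltzmann distribution gives P(z = (1,1)) = 1 / (2 + 2e^{-0.5}) ≈ 0.311, which the empirical state frequencies approach after a burn-in period. The spiking samplers discussed above replace the explicit sigmoid coin-flip with neuronal dynamics whose spike statistics realize the same conditional probabilities.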
An answer to this question is relevant not only for understanding how neural sampling can be instantiated in biological neurons, but also for implementing neural samplers on low-power neuromorphic implementations of spiking neurons (Indiveri et al., 2011). The stochastic nature of neural sampling suggests studying the behavior of neurons under noisy inputs. The diffusion model commonly referred to as the Ornstein-Uhlenbeck process (Van Kampen, 1992) has been the basis of a standard continuous-time stochastic neuron model since the first rigorous analysis of its behavior in Capocelli and Ricciardi (1971). Petrovici et al. (2013) discuss these issues and provide a rigorous link between deterministic neuron models (leaky integrate-and-fire neurons with conductance-based synapses) and stochastic network-level dynamics, as observed in vivo. In particular, they identify how the high-conductance state caused by Poissonian background bombardment can provide the fast membrane reaction time required for precise sampling. They derive the activation function analytically at the single-cell level as well as for the synaptic interaction, and investigate the convergence behavior of the sampled distribution at the network level.

O'Connor et al. (2013) employ the Siegert approximation of IF neurons to compute CD updates. The Siegert or diffusion approximation expresses the firing rate of an IF neuron as a function of its input firing rates, under the assumption that all inputs are independent and Poisson distributed. After learning, the parameters of the learned Boltzmann machine are transferred to the equivalent network of IF neurons. Although the off-line CD learning in O'Connor et al. (2013) operates on firing rates rather than spikes, in its basic form it is functionally equivalent to and compatible with event-driven CD under the condition that spike times are uncorrelated.
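As a concrete illustration of the Siegert approximation, the sketch below implements the standard diffusion-approximation transfer function for a current-based leaky IF neuron (the textbook form with mean drive mu, fluctuation sigma, and an integral over the scaled complementary error function). All neuron and synapse parameters here are illustrative assumptions, not values from O'Connor et al. (2013) or the paper:

```python
import numpy as np
from scipy.special import erfcx  # erfcx(x) = exp(x^2) * erfc(x), numerically stable

def siegert_rate(nu_in, w, n_syn=100, tau_m=0.02, tau_ref=0.002,
                 v_th=0.02, v_reset=0.0):
    """Output firing rate (Hz) of a current-based leaky IF neuron driven by
    n_syn independent Poisson inputs at rate nu_in (Hz), each with weight w
    (volts per spike), via the Siegert / diffusion approximation.
    All parameter values are illustrative placeholders."""
    mu = n_syn * w * nu_in * tau_m                     # mean membrane drive
    sigma = np.sqrt(n_syn * w**2 * nu_in * tau_m)      # input fluctuation scale
    # Integrate exp(u^2) * (1 + erf(u)) == erfcx(-u) over the scaled
    # interval between reset and threshold (trapezoidal rule).
    u = np.linspace((v_reset - mu) / sigma, (v_th - mu) / sigma, 2001)
    y = erfcx(-u)
    integral = float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(u)))
    return 1.0 / (tau_ref + tau_m * np.sqrt(np.pi) * integral)
```

The output rate is a smooth, monotonically increasing, sigmoid-like function of the input rate, saturating at 1/tau_ref, which is what lets a rate-based CD procedure treat the IF neuron as a probabilistic unit with a known activation function.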
Our work implements a biologically inspired algorithm for training Boltzmann machines (Neftci et al., 2014). We assumed a neuron model consistent with biology and realizable in a neuromorphic implementation. Petrovici et al. (2013) provided a deeper physical and mathematical interpretation of neural sampling. Similarly to their approach, we considered the standard leaky IF neuron stimulated by non-capacitively summed pre-synaptic inputs obeying Poisson statistics. The performance of event-driven CD on the MNIST hand-written digit recognition task was robust to spike probabilities that deviate slightly from the Boltzmann distribution, even though such distributions violate the assumptions under which CD was formulated for training RBMs. This suggests that event-driven CD provides a general learning framework for biologically inspired spiking RBMs and is consistent with a wide range of neural samplers.
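For reference, the objective that event-driven CD emulates is the standard CD-1 update for an RBM: hidden activations driven by data (positive phase) minus activations after one step of Gibbs reconstruction (negative phase). A minimal rate-based sketch in this standard machine-learning formulation, not the spike-driven rule of the paper (network size, learning rate, and initialization are illustrative):

```python
import numpy as np

def cd1_update(W, b_v, b_h, v0, lr=0.05, rng=None):
    """One Contrastive Divergence (CD-1) update of an RBM.
    W: (n_vis, n_hid) weights; b_v, b_h: visible/hidden biases;
    v0: batch of binary visible vectors, shape (batch, n_vis)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    # Positive phase: hidden probabilities and samples driven by the data.
    ph0 = sigmoid(v0 @ W + b_h)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step back to visibles, then to hiddens.
    pv1 = sigmoid(h0 @ W.T + b_v)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + b_h)
    # Updates: data correlations minus (one-step) model correlations.
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b_v += lr * (v0 - v1).mean(axis=0)
    b_h += lr * (ph0 - ph1).mean(axis=0)
    return W, b_v, b_h
```

Event-driven CD replaces the two phases above with weight updates triggered by individual pre- and post-synaptic spike events, modulated by a global signal that switches between the data-driven and reconstruction-driven regimes, so that the expected update matches this rate-based rule when spike times are uncorrelated.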

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
  10 in total

1.  Predicting spike timing of neocortical pyramidal neurons by simple threshold models.

Authors:  Renaud Jolivet; Alexander Rauch; Hans-Rudolf Lüscher; Wulfram Gerstner
Journal:  J Comput Neurosci       Date:  2006-04-22       Impact factor: 1.621

2.  Diffusion approximation and first passage time problem for a model neuron.

Authors:  R M Capocelli; L M Ricciardi
Journal:  Kybernetik       Date:  1971-06

3.  Neuromorphic silicon neuron circuits.

Authors:  Giacomo Indiveri; Bernabé Linares-Barranco; Tara Julia Hamilton; André van Schaik; Ralph Etienne-Cummings; Tobi Delbruck; Shih-Chii Liu; Piotr Dudek; Philipp Häfliger; Sylvie Renaud; Johannes Schemmel; Gert Cauwenberghs; John Arthur; Kai Hynna; Fopefolu Folowosele; Sylvain Saighi; Teresa Serrano-Gotarredona; Jayawan Wijekoon; Yingxue Wang; Kwabena Boahen
Journal:  Front Neurosci       Date:  2011-05-31       Impact factor: 4.677

4.  Spontaneous cortical activity reveals hallmarks of an optimal internal model of the environment.

Authors:  Pietro Berkes; Gergo Orbán; Máté Lengyel; József Fiser
Journal:  Science       Date:  2011-01-07       Impact factor: 47.728

5.  Statistically optimal perception and learning: from behavior to neural representations.

Authors:  József Fiser; Pietro Berkes; Gergo Orbán; Máté Lengyel
Journal:  Trends Cogn Sci       Date:  2010-02-12       Impact factor: 20.229

6.  Bayesian computation emerges in generic cortical microcircuits through spike-timing-dependent plasticity.

Authors:  Bernhard Nessler; Michael Pfeiffer; Lars Buesing; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2013-04-25       Impact factor: 4.475

7.  Stochastic computations in cortical microcircuit models.

Authors:  Stefan Habenschuss; Zeno Jonke; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2013-11-14       Impact factor: 4.475

8.  Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

Authors:  Lars Buesing; Johannes Bill; Bernhard Nessler; Wolfgang Maass
Journal:  PLoS Comput Biol       Date:  2011-11-03       Impact factor: 4.475

9.  Real-time classification and sensor fusion with a spiking deep belief network.

Authors:  Peter O'Connor; Daniel Neil; Shih-Chii Liu; Tobi Delbruck; Michael Pfeiffer
Journal:  Front Neurosci       Date:  2013-10-08       Impact factor: 4.677

10.  Event-driven contrastive divergence for spiking neuromorphic systems.

Authors:  Emre Neftci; Srinjoy Das; Bruno Pedroni; Kenneth Kreutz-Delgado; Gert Cauwenberghs
Journal:  Front Neurosci       Date:  2014-01-30       Impact factor: 4.677

