
Sensitive Finite-State Computations Using a Distributed Network With a Noisy Network Attractor.

Peter Ashwin, Claire Postlethwaite.   

Abstract

We exhibit a class of smooth continuous-state neural-inspired networks composed of simple nonlinear elements that can be made to function as a finite-state computational machine. We give an explicit construction of arbitrary finite-state virtual machines in the spatiotemporal dynamics of the network. The dynamics of the functional network can be completely characterized as a "noisy network attractor" in phase space operating in either an "excitable" or a "free-running" regime, corresponding, respectively, to excitable or heteroclinic connections between states. The regime depends on the sign of an "excitability parameter." Viewing the network as a nonlinear stochastic differential equation where a deterministic (signal) and/or a stochastic (noise) input is applied to any element, we explore the influence of the signal-to-noise ratio on the error rate of the computations. The free-running regime is extremely sensitive to inputs: arbitrarily small amplitude perturbations can be used to perform computations with the system as long as the input dominates the noise. We find a counter-intuitive regime where increasing noise amplitude can lead to more, rather than less, accurate computation. We suggest that noisy network attractors will be useful for understanding neural networks that reliably and sensitively perform finite-state computations in a noisy environment.
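The abstract describes viewing the network as a nonlinear stochastic differential equation driven by a deterministic signal and/or noise, with transitions between attractor states performing the computation. A minimal toy sketch of that idea, not the authors' construction, is an Euler-Maruyama simulation of a single bistable element, where noise and signal jointly control switching between two states; the drift term and all parameter names below are illustrative assumptions:

```python
import math
import random


def simulate_bistable_sde(signal=0.0, sigma=0.1, dt=0.01, steps=20000, seed=0):
    """Euler-Maruyama integration of a toy bistable SDE
        dx = (x - x^3 + signal) dt + sigma dW,
    an illustrative stand-in for one element of a noisy network attractor
    (not the model in the paper). Counts transitions between the two
    stable states near x = -1 and x = +1, with hysteresis thresholds
    at +/-0.5 so a single crossing is not double-counted.
    """
    rng = random.Random(seed)
    x = -1.0          # start in the "left" attractor state
    state = -1
    transitions = 0
    sqrt_dt = math.sqrt(dt)
    for _ in range(steps):
        drift = x - x**3 + signal          # bistable drift plus input signal
        x += drift * dt + sigma * rng.gauss(0.0, 1.0) * sqrt_dt
        if state == -1 and x > 0.5:        # crossed into the right state
            state, transitions = 1, transitions + 1
        elif state == 1 and x < -0.5:      # crossed back into the left state
            state, transitions = -1, transitions + 1
    return transitions
```

With no noise and no signal the element stays put; a sufficiently strong deterministic signal drives exactly one switch, while large noise alone produces spurious switches. Sweeping `sigma` against `signal` gives a crude picture of how the signal-to-noise ratio governs the error rate of state transitions, the quantity the paper studies in its full network setting.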

Year: 2018    PMID: 29993668    DOI: 10.1109/TNNLS.2018.2813404

Source DB: PubMed    Journal: IEEE Trans Neural Netw Learn Syst    ISSN: 2162-237X    Impact factor: 10.451


1 in total

1.  Excitable networks for finite state computation with continuous time recurrent neural networks.

Authors: Peter Ashwin; Claire Postlethwaite
Journal: Biol Cybern    Date: 2021-10-05    Impact factor: 2.086

