| Literature DB >> 29993668 |
Peter Ashwin, Claire Postlethwaite.
Abstract
We exhibit a class of smooth continuous-state neural-inspired networks composed of simple nonlinear elements that can be made to function as a finite-state computational machine. We give an explicit construction of arbitrary finite-state virtual machines in the spatiotemporal dynamics of the network. The dynamics of the functional network can be completely characterized as a "noisy network attractor" in phase space operating in either an "excitable" or a "free-running" regime, respectively, corresponding to excitable or heteroclinic connections between states. The regime depends on the sign of an "excitability parameter." Viewing the network as a nonlinear stochastic differential equation where a deterministic (signal) and/or a stochastic (noise) input is applied to any element, we explore the influence of the signal-to-noise ratio on the error rate of the computations. The free-running regime is extremely sensitive to inputs: arbitrarily small amplitude perturbations can be used to perform computations with the system as long as the input dominates the noise. We find a counter-intuitive regime where increasing noise amplitude can lead to more, rather than less, accurate computation. We suggest that noisy network attractors will be useful for understanding neural networks that reliably and sensitively perform finite-state computations in a noisy environment.
Year: 2018 PMID: 29993668 DOI: 10.1109/TNNLS.2018.2813404
Source DB: PubMed Journal: IEEE Trans Neural Netw Learn Syst ISSN: 2162-237X Impact factor: 10.451
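The mechanism the abstract describes, that a small deterministic input can drive reliable state transitions so long as it dominates the stochastic input, can be sketched with a single bistable element integrated by Euler-Maruyama. This toy model, its cubic drift, and all parameter values are our illustration of the general signal-versus-noise idea, not the authors' network-attractor construction:

```python
import numpy as np

def simulate_bistable(signal, sigma, T=20.0, dt=0.01, seed=0):
    """Euler-Maruyama integration of one bistable element,
    dx = (x - x**3 + signal(t)) dt + sigma dW.
    The wells near x = -1 and x = +1 stand in for two discrete states."""
    rng = np.random.default_rng(seed)
    x = -1.0  # start in the left well ("state 0")
    for i in range(int(T / dt)):
        t = i * dt
        drift = x - x**3 + signal(t)
        x += drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()
    return x

# A brief pulse plays the role of the deterministic (signal) input.
pulse = lambda t: 0.8 if 5.0 <= t <= 8.0 else 0.0

# Signal dominates the noise: the pulse reliably switches the element
# from the left well to the right well...
x_low = simulate_bistable(pulse, sigma=0.05)
# ...while with no input the element stays in its initial state.
x_rest = simulate_bistable(lambda t: 0.0, sigma=0.05)
```

Raising `sigma` until the noise is comparable to the pulse amplitude makes the switching unreliable, which is the error-rate dependence on signal-to-noise ratio that the paper quantifies for its full network.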