
Cellular Automata Can Reduce Memory Requirements of Collective-State Computing.

Denis Kleyko, Edward Paxon Frady, Friedrich T Sommer.   

Abstract

Various nonclassical approaches to distributed information processing, such as neural networks, reservoir computing (RC), vector symbolic architectures (VSAs), and others, employ the principle of collective-state computing. In this type of computing, the variables relevant to a computation are superimposed into a single high-dimensional state vector, the collective state. The variable encoding uses a fixed set of random patterns, which has to be stored and kept available during the computation. In this article, we show that an elementary cellular automaton with rule 90 (CA90) enables a space-time tradeoff for collective-state computing models that use random dense binary representations; i.e., memory requirements can be traded off against computation by running CA90. We investigate the randomization behavior of CA90, in particular the relation between the length of the randomization period and the size of the grid, and how CA90 preserves similarity in the presence of initialization noise. Based on these analyses, we discuss how to optimize a collective-state computing model in which CA90 expands representations on the fly from short seed patterns, rather than storing the full set of random patterns. The CA90 expansion is applied and tested in concrete scenarios using RC and VSAs. Our experimental results show that collective-state computing with CA90 expansion performs on par with traditional collective-state models, in which random patterns are generated initially by a pseudorandom number generator and then stored in a large memory.
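The core idea in the abstract, expanding a short random seed into a long pseudorandom binary vector by iterating rule 90, can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names `ca90_step` and `expand`, the cyclic boundary condition, and the concatenation-of-states expansion scheme are illustrative assumptions.

```python
import numpy as np

def ca90_step(state: np.ndarray) -> np.ndarray:
    # Rule 90: each cell becomes the XOR of its two neighbors
    # (cyclic boundary; illustrative assumption).
    return np.roll(state, 1) ^ np.roll(state, -1)

def expand(seed: np.ndarray, steps: int) -> np.ndarray:
    # Expand a short binary seed into a longer vector by concatenating
    # successive CA90 states instead of storing a long random pattern.
    states = [seed]
    for _ in range(steps - 1):
        states.append(ca90_step(states[-1]))
    return np.concatenate(states)

rng = np.random.default_rng(0)
seed = rng.integers(0, 2, size=32, dtype=np.uint8)  # short stored seed
vec = expand(seed, steps=8)  # 256-dimensional representation, built on the fly
```

Only the 32-bit seed would need to be stored; the 256-dimensional pattern is regenerated deterministically whenever it is needed, which is the space-time tradeoff the abstract describes.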

Entities:  

Year:  2022        PMID: 34699370      PMCID: PMC9215349          DOI: 10.1109/TNNLS.2021.3119543

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   14.255


  18 in total

1.  Real-time computing without stable states: a new framework for neural computation based on perturbations.

Authors:  Wolfgang Maass; Thomas Natschläger; Henry Markram
Journal:  Neural Comput       Date:  2002-11       Impact factor: 2.026

2.  Holographic reduced representations.

Authors:  T A Plate
Journal:  IEEE Trans Neural Netw       Date:  1995

3.  Holographic Graph Neuron: A Bioinspired Architecture for Pattern Processing.

Authors:  Denis Kleyko; Evgeny Osipov; Alexander Senior; Asad I Khan; Yasar Ahmet Sekercioglu
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2016-03-11       Impact factor: 10.451

4.  Resonator Networks, 2: Factorization Performance and Capacity Compared to Optimization-Based Methods.

Authors:  Spencer J Kent; E Paxon Frady; Friedrich T Sommer; Bruno A Olshausen
Journal:  Neural Comput       Date:  2020-10-20       Impact factor: 2.026

5.  Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations of Data Structures.

Authors:  E Paxon Frady; Spencer J Kent; Bruno A Olshausen; Friedrich T Sommer
Journal:  Neural Comput       Date:  2020-10-20       Impact factor: 2.026

6.  Density Encoding Enables Resource-Efficient Randomly Connected Neural Networks.

Authors:  Denis Kleyko; Mansour Kheffache; E Paxon Frady; Urban Wiklund; Evgeny Osipov
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2021-08-03       Impact factor: 10.451

7.  A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks.

Authors:  E Paxon Frady; Denis Kleyko; Friedrich T Sommer
Journal:  Neural Comput       Date:  2018-04-13       Impact factor: 2.026

8.  Integer Echo State Networks: Efficient Reservoir Computing for Digital Hardware.

Authors:  Denis Kleyko; Edward Paxon Frady; Mansour Kheffache; Evgeny Osipov
Journal:  IEEE Trans Neural Netw Learn Syst       Date:  2022-04-04       Impact factor: 10.451

9.  Neural Variability and Sampling-Based Probabilistic Representations in the Visual Cortex.

Authors:  Gergő Orbán; Pietro Berkes; József Fiser; Máté Lengyel
Journal:  Neuron       Date:  2016-10-19       Impact factor: 17.173

  1 in total

1.  Efficient emotion recognition using hyperdimensional computing with combinatorial channel encoding and cellular automata.

Authors:  Alisha Menon; Anirudh Natarajan; Reva Agashe; Daniel Sun; Melvin Aristio; Harrison Liew; Yakun Sophia Shao; Jan M Rabaey
Journal:  Brain Inform       Date:  2022-06-27
