
STDP-based spiking deep convolutional neural networks for object recognition.

Saeed Reza Kheradpisheh, Mohammad Ganjtabesh, Simon J Thorpe, Timothée Masquelier.

Abstract

Previous studies have shown that spike-timing-dependent plasticity (STDP) can be used in spiking neural networks (SNNs) to extract visual features of low or intermediate complexity in an unsupervised manner. These studies, however, used relatively shallow architectures, and only one layer was trainable. Another line of research has demonstrated - using rate-based neural networks trained with back-propagation - that having many layers increases the recognition robustness, an approach known as deep learning. We thus designed a deep SNN, comprising several convolutional (trainable with STDP) and pooling layers. We used a temporal coding scheme where the most strongly activated neurons fire first, and less activated neurons fire later or not at all. The network was exposed to natural images. Thanks to STDP, neurons progressively learned features corresponding to prototypical patterns that were both salient and frequent. Only a few tens of examples per category were required and no label was needed. After learning, the complexity of the extracted features increased along the hierarchy, from edge detectors in the first layer to object prototypes in the last layer. Coding was very sparse, with only a few thousand spikes per image, and in some cases the object category could be reasonably well inferred from the activity of a single higher-order neuron. More generally, the activity of a few hundred such neurons contained robust category information, as demonstrated using a classifier on the Caltech 101, ETH-80, and MNIST databases. We also demonstrate the superiority of STDP over other unsupervised techniques such as random crops (HMAX) or auto-encoders. Taken together, our results suggest that the combination of STDP with latency coding may be a key to understanding the way that the primate visual system learns, its remarkable processing speed, and its low energy consumption.
These mechanisms are also interesting for artificial vision systems, particularly for hardware solutions.
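The two core mechanisms the abstract describes - intensity-to-latency coding (the most strongly activated inputs fire first) and an unsupervised STDP weight update - can be sketched as follows. This is a minimal NumPy illustration, not the authors' released code; the function names, learning rates, and the exact multiplicative weight-bounding form are assumptions based on the description.

```python
import numpy as np

def intensity_to_latency(image, t_max=1.0):
    """Latency coding: stronger activations fire earlier.

    Pixels with zero activation never fire (latency = inf), matching the
    "fire later or not at all" scheme in the abstract. `t_max` is an
    illustrative normalisation constant.
    """
    img = np.asarray(image, dtype=float)
    latencies = np.full(img.shape, np.inf)
    active = img > 0
    latencies[active] = t_max * (1.0 - img[active] / img.max())
    return latencies

def stdp_update(weights, pre_times, post_time, a_plus=0.004, a_minus=0.003):
    """Simplified STDP: potentiate synapses whose presynaptic spike precedes
    (or coincides with) the postsynaptic spike, depress the rest.

    The multiplicative w*(1-w) factor softly bounds weights in [0, 1].
    Learning rates here are illustrative, not the paper's exact values.
    """
    w = np.asarray(weights, dtype=float).copy()
    causal = np.asarray(pre_times) <= post_time   # pre fired before post
    w[causal] += a_plus * w[causal] * (1.0 - w[causal])
    w[~causal] -= a_minus * w[~causal] * (1.0 - w[~causal])
    return np.clip(w, 0.0, 1.0)
```

For example, a pixel at maximum intensity gets latency 0 (fires first), while after one update a synapse whose presynaptic spike preceded the postsynaptic spike is strengthened and the others are weakened, so repeated presentations make neurons selective to frequent, salient input patterns without any labels.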
Copyright © 2017 Elsevier Ltd. All rights reserved.

Keywords:  Deep learning; Object recognition; STDP; Spiking neural network; Temporal coding

Year:  2017        PMID: 29328958     DOI: 10.1016/j.neunet.2017.12.005

Source DB:  PubMed          Journal:  Neural Netw        ISSN: 0893-6080


Related articles: 33 in total

1.  Backpropagation with biologically plausible spatiotemporal adjustment for training deep spiking neural networks.

Authors:  Guobin Shen; Dongcheng Zhao; Yi Zeng
Journal:  Patterns (N Y)       Date:  2022-06-02

2.  The covariance perceptron: A new paradigm for classification and processing of time series in recurrent neuronal networks.

Authors:  Matthieu Gilson; David Dahmen; Rubén Moreno-Bote; Andrea Insabato; Moritz Helias
Journal:  PLoS Comput Biol       Date:  2020-10-12       Impact factor: 4.475

3.  Event-Based Trajectory Prediction Using Spiking Neural Networks.

Authors:  Guillaume Debat; Tushar Chauhan; Benoit R Cottereau; Timothée Masquelier; Michel Paindavoine; Robin Baures
Journal:  Front Comput Neurosci       Date:  2021-05-24       Impact factor: 2.380

4.  CRBA: A Competitive Rate-Based Algorithm Based on Competitive Spiking Neural Networks.

Authors:  Paolo G Cachi; Sebastián Ventura; Krzysztof J Cios
Journal:  Front Comput Neurosci       Date:  2021-04-22       Impact factor: 2.380

5.  An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data.

Authors:  Evangelos Stromatias; Miguel Soto; Teresa Serrano-Gotarredona; Bernabé Linares-Barranco
Journal:  Front Neurosci       Date:  2017-06-28       Impact factor: 4.677

6.  Unsupervised Feature Learning With Winner-Takes-All Based STDP.

Authors:  Paul Ferré; Franck Mamalet; Simon J Thorpe
Journal:  Front Comput Neurosci       Date:  2018-04-05       Impact factor: 2.380

7.  Optimal Localist and Distributed Coding of Spatiotemporal Spike Patterns Through STDP and Coincidence Detection.

Authors:  Timothée Masquelier; Saeed R Kheradpisheh
Journal:  Front Comput Neurosci       Date:  2018-09-18       Impact factor: 2.380

8.  Unsupervised speech recognition through spike-timing-dependent plasticity in a convolutional spiking neural network.

Authors:  Meng Dong; Xuhui Huang; Bo Xu
Journal:  PLoS One       Date:  2018-11-29       Impact factor: 3.240

9.  Exploring Optimized Spiking Neural Network Architectures for Classification Tasks on Embedded Platforms.

Authors:  Tehreem Syed; Vijay Kakani; Xuenan Cui; Hakil Kim
Journal:  Sensors (Basel)       Date:  2021-05-07       Impact factor: 3.576

10.  On-Chip Training Spiking Neural Networks Using Approximated Backpropagation With Analog Synaptic Devices.

Authors:  Dongseok Kwon; Suhwan Lim; Jong-Ho Bae; Sung-Tae Lee; Hyeongsu Kim; Young-Tak Seo; Seongbin Oh; Jangsaeng Kim; Kyuho Yeom; Byung-Gook Park; Jong-Ho Lee
Journal:  Front Neurosci       Date:  2020-07-07       Impact factor: 4.677

