
Rectified Linear Postsynaptic Potential Function for Backpropagation in Deep Spiking Neural Networks.

Malu Zhang, Jiadong Wang, Jibin Wu, Ammar Belatreche, Burin Amornpaisannon, Zhixuan Zhang, Venkata Pavan Kumar Miriyala, Hong Qu, Yansong Chua, Trevor E Carlson, Haizhou Li.   

Abstract

Spiking neural networks (SNNs) use spatiotemporal spike patterns to represent and transmit information, which is not only biologically realistic but also well suited to ultralow-power, event-driven neuromorphic implementation. Like other deep learning techniques, deep SNNs (DeepSNNs) benefit from a deep architecture. However, training DeepSNNs is not straightforward because the well-studied error backpropagation (BP) algorithm is not directly applicable. In this article, we first establish an understanding of why error BP does not work well in DeepSNNs. We then propose a simple yet efficient rectified linear postsynaptic potential function (ReL-PSP) for spiking neurons and a spike-timing-dependent BP (STDBP) learning algorithm for DeepSNNs, in which the timing of individual spikes conveys information (temporal coding) and learning (BP) is performed on spike times in an event-driven manner. We show that DeepSNNs trained with the proposed single-spike-time-based learning algorithm achieve state-of-the-art classification accuracy. Furthermore, using the model parameters obtained from the proposed STDBP learning algorithm, we demonstrate ultralow-power inference on a recently proposed neuromorphic inference accelerator. The experimental results show that the neuromorphic hardware consumes a total power of 0.751 mW and achieves a low latency of 47.71 ms to classify an image from the Modified National Institute of Standards and Technology (MNIST) dataset. Overall, this work investigates the contribution of spike-timing dynamics to information encoding, synaptic plasticity, and decision-making, providing a new perspective on the design of future DeepSNNs and neuromorphic hardware.
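The key property of the ReL-PSP described above is that each input spike at time t_j contributes w_j * (t - t_j) to the membrane potential for t > t_j, making the potential piecewise linear in time, so the threshold-crossing (output spike) time can be solved in closed form. The sketch below illustrates that idea; the function name, the unit threshold, and the exact interface are illustrative assumptions, not the authors' reference implementation.

```python
def relpsp_spike_time(input_times, weights, threshold=1.0):
    """Closed-form output spike time of a neuron with ReL-PSP kernels.

    With a rectified-linear PSP, V(t) = sum over received spikes of
    w_j * (t - t_j), i.e. V(t) = t * sum(w_j) - sum(w_j * t_j), which is
    piecewise linear between input spikes. We scan the input spikes in
    temporal order and solve V(t) = threshold on each linear segment.
    (Illustrative sketch; threshold value is an assumption.)
    """
    pairs = sorted(zip(input_times, weights))  # process spikes in time order
    w_sum = 0.0   # running sum of weights of spikes received so far
    wt_sum = 0.0  # running sum of w_j * t_j for spikes received so far
    for k, (t_j, w_j) in enumerate(pairs):
        w_sum += w_j
        wt_sum += w_j * t_j
        if w_sum <= 0.0:
            continue  # potential not rising on this segment
        # Candidate crossing time on the current linear segment:
        # t * w_sum - wt_sum = threshold  =>  t = (threshold + wt_sum) / w_sum
        t_out = (threshold + wt_sum) / w_sum
        # Valid only if the crossing happens before the next input arrives.
        next_t = pairs[k + 1][0] if k + 1 < len(pairs) else float("inf")
        if t_j <= t_out <= next_t:
            return t_out
    return float("inf")  # neuron never reaches threshold
```

Because the output spike time is an explicit function of input spike times and weights, its gradients with respect to both are well defined, which is what makes spike-timing-based BP through such neurons tractable.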


Year:  2022        PMID: 34534091     DOI: 10.1109/TNNLS.2021.3110991

Source DB:  PubMed          Journal:  IEEE Trans Neural Netw Learn Syst        ISSN: 2162-237X            Impact factor:   10.451


Related articles (4 in total)

1.  Tasseled Crop Rows Detection Based on Micro-Region of Interest and Logarithmic Transformation.

Authors:  Zhenling Yang; Yang Yang; Chaorong Li; Yang Zhou; Xiaoshuang Zhang; Yang Yu; Dan Liu
Journal:  Front Plant Sci       Date:  2022-06-27       Impact factor: 6.627

2.  Memory-inspired spiking hyperdimensional network for robust online learning.

Authors:  Zhuowen Zou; Haleh Alimohamadi; Ali Zakeri; Farhad Imani; Yeseong Kim; M Hassan Najafi; Mohsen Imani
Journal:  Sci Rep       Date:  2022-05-10       Impact factor: 4.996

3.  Analyzing time-to-first-spike coding schemes: A theoretical approach.

Authors:  Lina Bonilla; Jacques Gautrais; Simon Thorpe; Timothée Masquelier
Journal:  Front Neurosci       Date:  2022-09-26       Impact factor: 5.152

4.  Visual explanations from spiking neural networks using inter-spike intervals.

Authors:  Youngeun Kim; Priyadarshini Panda
Journal:  Sci Rep       Date:  2021-09-24       Impact factor: 4.379

