Supervised Learning in Spiking Neural Networks with Synaptic Delay-Weight Plasticity

Malu Zhang, Jibin Wu, Ammar Belatreche, Zihan Pan, Xiurui Xie, Yansong Chua, Guoqi Li, Hong Qu, Haizhou Li

Research output: Contribution to journal › Article › peer-review


Abstract

Spiking neurons encode information through their temporal spiking patterns. Although precise spike-timing-based encoding has long been recognised, the exact mechanism underlying the learning of such precise spike timing in the brain remains an open question. Most existing learning methods for spiking neurons are based on synaptic weight adjustment. However, biological evidence suggests that synaptic delays can also be modulated and play an important role in the learning process. This paper investigates the viability of integrating synaptic delay plasticity into supervised learning and proposes a novel learning method, referred to as synaptic delay-weight plasticity, that adjusts both the synaptic delays and weights of the learning neurons so that they fire precisely timed spikes. Two representative supervised learning methods, the Remote Supervised Method (ReSuMe) and the Perceptron-Based Spiking Neuron Learning Rule (PBSNLR), are studied to illustrate how synaptic delay-weight plasticity works. The performance of the proposed learning method is thoroughly evaluated on synthetic data and further demonstrated on real-world classification tasks. The experiments show that the synaptic delay-weight learning method outperforms traditional synaptic weight learning methods in many respects.
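The abstract does not give the paper's exact update equations, but the core idea — error-driven adjustment of both synaptic weights and synaptic delays so that the output spike lands at the desired time — can be sketched as follows. This is a minimal illustrative sketch only: the PSP kernel, learning rates, and the specific weight-contrast and delay-nudge terms below are hypothetical stand-ins, loosely inspired by ReSuMe-style error-driven updates, not the method defined in the paper.

```python
import numpy as np

def psp_kernel(t, tau=5.0):
    """Simple exponential postsynaptic-potential kernel (illustrative choice)."""
    return np.where(t >= 0, np.exp(-t / tau), 0.0)

def delay_weight_update(weights, delays, pre_spikes, t_actual, t_desired,
                        eta_w=0.01, eta_d=0.1):
    """One error-driven update of both weights and delays (hypothetical rule).

    Weights move in proportion to the difference between each input's
    delayed PSP at the desired vs. the actual output spike time (a
    ReSuMe-like contrast term); delays are nudged so that delayed input
    spikes align more closely with the desired output spike time.
    """
    arrival = pre_spikes + delays                 # effective arrival times
    err_w = psp_kernel(t_desired - arrival) - psp_kernel(t_actual - arrival)
    new_w = weights + eta_w * err_w
    # Shift each delay toward making its spike arrive at t_desired.
    new_d = delays + eta_d * np.sign(t_desired - arrival)
    return new_w, np.clip(new_d, 0.0, None)       # delays stay non-negative

# Usage: three input spikes; the neuron fired at t=12 but should fire at t=10.
w = np.full(3, 0.5)
d = np.array([1.0, 2.0, 3.0])
pre = np.array([5.0, 6.0, 9.0])
w2, d2 = delay_weight_update(w, d, pre, t_actual=12.0, t_desired=10.0)
```

The key point the sketch illustrates is that delay plasticity gives the neuron a second degree of freedom: an input that arrives too late can be made useful by shortening its delay, rather than only by suppressing its weight.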
Original language: English
Pages (from-to): 103-118
Number of pages: 16
Journal: Neurocomputing
Volume: 409
Early online date: 12 Apr 2020
DOIs
Publication status: Published - 7 Oct 2020
