A temporally and spatially local spike-based backpropagation algorithm to enable training in hardware
Blog Article
Spiking neural networks (SNNs) have emerged as a hardware-efficient architecture for classification tasks. The challenge of spike-based encoding has been the lack of a universal training mechanism performed entirely with spikes. There have been several attempts to adopt the powerful backpropagation (BP) technique used in non-spiking artificial neural networks (ANNs): (1) SNNs can be trained by externally computed numerical gradients.
(2) A major advancement toward native spike-based learning has been the use of approximate BP with spike-timing-dependent plasticity and phased forward/backward passes. However, transferring information between such phases for gradient and weight-update calculation requires external memory and computational access, which is a challenge for standard neuromorphic hardware implementations.
In this paper, we propose a stochastic SNN-based backpropagation (SSNN-BP) algorithm that utilizes a composite neuron to simultaneously compute the forward-pass activations and backward-pass gradients explicitly with spikes. Although signed gradient values are a challenge for spike-based representation, we tackle this by splitting the gradient signal into positive and negative streams. The composite neuron encodes information in the form of stochastic spike trains and converts BP weight updates into temporally and spatially local spike-coincidence updates compatible with hardware-friendly resistive processing units.
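To make the spike-coincidence update concrete, here is a minimal NumPy sketch. It is not the paper's implementation: `spike_train` and `coincidence_update` are illustrative names, and activations and gradient magnitudes are assumed to be normalized firing probabilities in [0, 1]. The forward activation and the two signed-gradient streams are encoded as stochastic (Bernoulli) spike trains, and the weight update is the difference of coincidence counts, whose expectation recovers the usual BP product:

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_train(p, T):
    """Bernoulli spike train of length T: each step fires with probability p."""
    return rng.random(T) < np.clip(p, 0.0, 1.0)

def coincidence_update(x, g, T=1000, lr=0.1):
    """Estimate the BP update -lr * g * x from spike coincidences only.

    The signed gradient g is split into positive and negative streams,
    so every encoded quantity is a non-negative firing probability; the
    update needs only per-step pre/post spike coincidences, i.e. it is
    temporally and spatially local.
    """
    fwd = spike_train(x, T)               # forward-pass activation spikes
    g_pos = spike_train(max(g, 0.0), T)   # positive gradient stream
    g_neg = spike_train(max(-g, 0.0), T)  # negative gradient stream
    n_coinc = np.sum(fwd & g_pos) - np.sum(fwd & g_neg)
    return -lr * n_coinc / T              # expectation: -lr * x * (g+ - g-) = -lr * x * g

x, g = 0.6, -0.35
print(coincidence_update(x, g, T=100_000))  # approaches the exact value as T grows
print(-0.1 * g * x)                         # exact BP update for comparison
```

Because the estimate averages T independent coincidence samples, its variance shrinks as 1/T, which mirrors the abstract's claim that the method approaches the BP ANN baseline for sufficiently long spike trains.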
Furthermore, we characterize the quantization effect of discrete spike-based weight updates to show that our method approaches the BP ANN baseline with sufficiently long spike trains. Finally, we show that the well-performing softmax cross-entropy loss function can be implemented through inhibitory lateral connections enforcing a winner-take-all (WTA) rule. Our two-layer SNN shows excellent generalization, with performance comparable to ANNs of equivalent architecture and regularization parameters on static image datasets such as MNIST, Fashion-MNIST, and Extended MNIST, and on the temporally encoded Neuromorphic MNIST (N-MNIST) dataset.
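The WTA-to-softmax connection can be illustrated in the same spirit. The paper's mechanism is inhibitory lateral connections; the sketch below substitutes the standard Gumbel-max trick for those stochastic WTA dynamics (an assumption made for brevity, not the paper's circuit). At each step a single winner fires and the rest are suppressed, and the empirical firing rates converge to the softmax distribution over the output potentials:

```python
import numpy as np

rng = np.random.default_rng(1)

def wta_firing_rates(u, T=200_000):
    """One-winner-per-step WTA: each step, the neuron with the largest
    noise-perturbed potential fires; lateral inhibition silences the rest.
    With Gumbel-distributed noise (Gumbel-max trick, standing in for the
    paper's lateral-inhibition dynamics), firing rates equal softmax(u)."""
    noise = rng.gumbel(size=(T, len(u)))    # per-step stochastic drive
    winners = np.argmax(u + noise, axis=1)  # index of the single spiking neuron
    return np.bincount(winners, minlength=len(u)) / T

u = np.array([2.0, 1.0, 0.1])               # output membrane potentials
print(wta_firing_rates(u))                  # empirical spike rates
print(np.exp(u) / np.exp(u).sum())          # softmax(u) for comparison
```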
Thus, SSNN-BP enables BP compatible with purely spike-based neuromorphic hardware.