Christian Klos, University of Bonn: Smooth, exact and efficient event-based training of spiking neural networks
| When | Jan 28, 2026, from 12:15 PM to 01:00 PM |
|---|---|
| Where | Bernstein Center Freiburg, Hansastr. 9a, 79104 Freiburg, Lecture Hall, ground floor |
| Contact Name | Martina Bacher |
Abstract
The ability to train spiking neural network models is essential for modeling biological neural networks and for neuromorphic computing. For training non-spiking neural networks, gradient descent is the standard approach. Its application to spiking neural networks is, however, complicated by the discreteness of spikes, which hinders gradient computation and can lead to disruptive (dis)appearances of spikes during training. To overcome these complications, the currently most popular approach uses timestep-based simulations and modifies backpropagation to yield nonzero, surrogate gradients.
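For orientation, here is a minimal sketch of this popular surrogate-gradient approach (not the method presented in the talk), written in PyTorch; the fast-sigmoid surrogate and the steepness parameter `beta` are illustrative assumptions rather than choices taken from the talk:

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a surrogate derivative.

    Forward pass: discrete spike, 1 if the membrane potential exceeds threshold.
    Backward pass: a smooth fast-sigmoid surrogate replaces the zero/undefined
    true derivative, so gradients can flow through the spike.
    """

    @staticmethod
    def forward(ctx, v_minus_thresh):
        ctx.save_for_backward(v_minus_thresh)
        return (v_minus_thresh > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        beta = 10.0  # surrogate steepness; an illustrative value
        return grad_output / (beta * x.abs() + 1.0) ** 2

def lif_step(v, input_current, dt=1.0, tau=10.0, v_thresh=1.0):
    """One Euler timestep of a leaky integrate-and-fire neuron."""
    v = v + (dt / tau) * (-v + input_current)
    spike = SurrogateSpike.apply(v - v_thresh)
    return v * (1.0 - spike), spike  # reset the voltage to zero on spike
```

Because the forward pass still emits discrete spikes, the dynamics retain the disruptive spike (dis)appearances mentioned above; only the backward pass is smoothed.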
In my talk, I will present a novel, alternative approach that enables efficient, smooth, and exact gradient descent training. It uses event-based simulations and a new, memory-efficient method to compute numerically precise exact gradients. I will then demonstrate that, perhaps surprisingly, spiking dynamics that change continuously or even smoothly, i.e., in particular without disruptive spike (dis)appearances, can be achieved. Such dynamics are generated, among others, by neuron models that, like real neurons, produce spikes via a self-amplification mechanism and reach infinite voltage in finite time; this includes the standard quadratic integrate-and-fire neuron. Such neuron models also enable gradient-based spike removal and addition. The latter is based on what we call pseudospikes, which even allow the training of deep, initially silent networks.
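To see why such self-amplifying neuron models fit event-based, exact gradient computation, consider a toy version (a sketch of the underlying idea under simplifying assumptions, not the paper's algorithm): for quadratic integrate-and-fire dynamics dV/dt = V² with no input between events, the spike time follows in closed form and depends smoothly on the initial voltage, so its derivative is exact:

```python
import numpy as np

def qif_spike_time(v0):
    """Spike time of a QIF neuron obeying dV/dt = V^2 with V(0) = v0.

    The closed-form solution V(t) = v0 / (1 - v0 * t) reaches infinite
    voltage at the finite time t* = 1 / v0, which defines the spike time.
    """
    if v0 <= 0:
        return np.inf  # the voltage never diverges, so no spike occurs
    return 1.0 / v0

def qif_spike_time_grad(v0):
    """Exact derivative of the spike time with respect to v0: d(1/v0)/dv0."""
    return -1.0 / v0 ** 2

# The exact gradient agrees with a finite-difference check to numerical precision.
v0, eps = 0.5, 1e-6
fd = (qif_spike_time(v0 + eps) - qif_spike_time(v0 - eps)) / (2 * eps)
print(qif_spike_time_grad(v0), fd)  # -4.0 vs. approximately -4.0
```

In an event-based simulation, spike times like this are chained from event to event, so gradients of a loss with respect to parameters can in principle be computed exactly, without discretizing time.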
I will illustrate our scheme using standard setups from theoretical neuroscience, namely balanced networks, and from neuromorphic computing, namely MNIST and the Spiking Heidelberg Digits dataset.
Reference: C. Klos and R.-M. Memmesheimer, Smooth exact gradient descent learning in spiking neural networks, Phys. Rev. Lett. 134, 027301 (2025).
