Online learning without catastrophic forgetting in deep spiking neural networks

Determine training methodologies for deep spiking neural networks that enable online learning while simultaneously avoiding catastrophic forgetting in continual, real-time settings.

Background

The paper reviews the status of training methods for spiking neural networks (SNNs), noting that this area remains in its early phases compared to traditional artificial neural networks. While surrogate gradient techniques and backpropagation through time are used to mitigate issues like the dead neuron problem, robust solutions for continual adaptation are not yet mature.
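To make the surrogate-gradient idea concrete, here is a minimal, self-contained sketch (not the paper's implementation) of a leaky integrate-and-fire neuron. The Heaviside spike function has a zero derivative almost everywhere, which causes the dead neuron problem; a smooth surrogate (here a fast-sigmoid derivative, one common choice) is substituted on the backward pass so gradients can flow. The `tau`, `v_th`, and `beta` values are illustrative assumptions.

```python
import numpy as np

def heaviside(v):
    """Non-differentiable spike function: emits 1 when potential crosses threshold."""
    return (v >= 0.0).astype(float)

def surrogate_grad(v, beta=10.0):
    """Fast-sigmoid surrogate derivative, used in place of the Heaviside's
    zero-almost-everywhere gradient (illustrative choice of surrogate)."""
    return beta / (beta * np.abs(v) + 1.0) ** 2

def lif_forward(inputs, tau=0.9, v_th=1.0):
    """Simulate a leaky integrate-and-fire neuron over a spike-time sequence."""
    v, spikes, vs = 0.0, [], []
    for x in inputs:
        v = tau * v + x                       # leaky integration of input current
        s = heaviside(np.array(v - v_th)).item()
        spikes.append(s)
        vs.append(v)
        v = v * (1.0 - s)                     # reset membrane potential on spike
    return np.array(spikes), np.array(vs)

spikes, vs = lif_forward(np.array([0.5, 0.6, 0.2, 0.9]))
# Backward pass would use the surrogate, which is nonzero even for
# sub-threshold potentials, so silent neurons still receive gradient.
grads = surrogate_grad(vs - 1.0)
```

Because the surrogate derivative is strictly positive, even time steps with no spike contribute gradient signal, which is precisely how the dead neuron problem is mitigated.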

The authors emphasize the need for SNN training approaches that work in online learning scenarios, where data arrives sequentially and models must update continuously, without succumbing to catastrophic forgetting, a phenomenon in which newly learned information overwrites previously acquired knowledge. The challenge is particularly acute in real-time systems with small or unit batch sizes, a setting the paper's methodology section revisits when discussing forgetting risks.
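One common mitigation for forgetting in single-sample streams (not proposed by this paper, but standard in the continual-learning literature) is a small experience-replay buffer: each online update interleaves the new example with a few stored old ones, so a unit-batch gradient step cannot overwrite earlier knowledge unopposed. A minimal sketch, with reservoir sampling to keep the buffer unbiased over the stream; `ReservoirReplay` and `online_step` are illustrative names:

```python
import random

class ReservoirReplay:
    """Tiny replay buffer using reservoir sampling: every example seen so far
    has equal probability of residing in the fixed-capacity buffer."""
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        return self.rng.sample(self.buffer, min(k, len(self.buffer)))

def online_step(model_update, x, buf, replay_k=2):
    """Apply one update for the new example plus a few replayed old ones,
    so the batch-size-1 stream does not erase previously learned behavior."""
    for ex in [x] + buf.sample(replay_k):
        model_update(ex)   # e.g., one surrogate-gradient SGD step per example
    buf.add(x)

# Usage on a toy stream of 20 examples; model_update just records calls here.
buf = ReservoirReplay(capacity=5)
updates = []
for x in range(20):
    online_step(updates.append, x, buf)
```

The buffer stays at its fixed capacity regardless of stream length, which matches the memory constraints of the real-time setting the authors highlight.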

References

Training deep spiking neural networks is in its early phases, and an important open question is how to enable online learning while avoiding catastrophic forgetting.

Brain-Inspired Quantum Neural Architectures for Pattern Recognition: Integrating QSNN and QLSTM (2505.01735 - Andrés et al., 3 May 2025) in Section 2.1 (Spiking Neural Networks)