Neuromorphic Incremental On-Chip Learning with Hebbian Weight Consolidation
Abstract: As next-generation implantable brain-machine interfaces (BMIs) become pervasive on edge devices, neuromorphic chips urgently need to learn new tasks incrementally in a biologically plausible manner. Owing to the inherent characteristics of their structure, spiking neural networks are naturally well-suited to BMI chips. Here we propose Hebbian Weight Consolidation (HWC) together with an on-chip learning framework, MLoC. HWC selectively masks synaptic modifications important to previous tasks, preserving old knowledge while leaving the remaining synapses free to store new knowledge from subsequent tasks. Leveraging the bio-plasticity of dendritic spines, the intrinsically self-organizing nature of HWC aligns naturally with the incremental learning paradigm and facilitates robust learning outcomes. By reading out spikes layer by layer and performing back-propagation on an external micro-controller unit, MLoC accomplishes on-chip learning efficiently. Experiments show that HWC outperforms the lower bound (no incremental learning algorithm) by up to 23.19%, particularly in the more challenging monkey behavior decoding scenarios. When on-chip computing on the SynSense Speck 2e chip is taken into account, the proposed algorithm still achieves an improvement of 11.06%. This study demonstrates the feasibility of employing incremental learning for high-performance neural signal decoding in next-generation brain-machine interfaces.
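The core mechanism the abstract describes — accumulating a Hebbian importance score per synapse, consolidating the most important synapses after a task, and masking subsequent weight updates so they only touch the still-plastic synapses — can be illustrated with a minimal NumPy sketch. The co-activity importance measure, the median threshold, and all array shapes below are illustrative assumptions for a toy layer, not the paper's exact consolidation rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: W maps 8 pre-synaptic rates to 4 post-synaptic rates.
W = rng.normal(scale=0.1, size=(4, 8))

# --- Task A: accumulate a Hebbian importance score per synapse. ---
# Here importance is the mean co-activity |post_i * pre_j| over task-A
# batches (an assumed stand-in for the paper's consolidation signal).
importance = np.zeros_like(W)
n_batches = 100
for _ in range(n_batches):
    pre = rng.random(8)            # pre-synaptic activity (e.g. spike rates)
    post = W @ pre                 # post-synaptic activity
    importance += np.abs(np.outer(post, pre))
importance /= n_batches

# --- Consolidation: freeze the most important half of the synapses. ---
threshold = np.median(importance)
plastic_mask = (importance < threshold).astype(W.dtype)  # 1 = still plastic

# --- Task B: gradient updates pass only through the plastic mask. ---
grad = rng.normal(size=W.shape)    # placeholder task-B gradient
lr = 0.01
W_before = W.copy()
W -= lr * plastic_mask * grad      # consolidated synapses stay untouched

# Old knowledge is preserved on the frozen synapses.
frozen = plastic_mask == 0
assert np.allclose(W[frozen], W_before[frozen])
```

In an on-chip setting such as the one the abstract outlines, the gradient would come from spikes read out layer by layer and back-propagated on the external micro-controller, with the mask applied before the masked update is written back to the chip's synapse memory.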