Neuromorphic Incremental on-chip Learning with Hebbian Weight Consolidation

Published 30 Nov 2023 in cs.ET (arXiv:2311.18340v1)

Abstract: As next-generation implantable brain-machine interfaces (BMIs) become pervasive on edge devices, the ability to incrementally learn new tasks in a bio-plastic manner is urgently needed for neuromorphic chips. Due to their inherent structural characteristics, spiking neural networks are naturally well suited for BMI chips. Here we propose Hebbian Weight Consolidation (HWC), together with an on-chip learning framework (MLoC). HWC selectively masks synapse modifications associated with previous tasks, retaining those synapses to preserve old knowledge while the remaining capacity stores new knowledge from subsequent tasks. Leveraging the bio-plasticity of dendritic spines, the intrinsic self-organizing nature of Hebbian Weight Consolidation aligns naturally with the incremental learning paradigm, facilitating robust learning outcomes. By reading out spikes layer by layer and performing back-propagation on an external micro-controller unit, MLoC can efficiently accomplish on-chip learning. Experiments show that our HWC algorithm outperforms the lower bound (training without any incremental learning algorithm) by up to 23.19%, particularly in the more challenging monkey behavior decoding scenarios. Accounting for on-chip computing on the Synsense Speck 2e chip, the proposed algorithm exhibits an improvement of 11.06%. This study demonstrates the feasibility of employing incremental learning for high-performance neural signal decoding in next-generation brain-machine interfaces.
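The abstract's core mechanism (selectively masking synapse modifications for previous tasks based on Hebbian co-activity) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's exact algorithm: the importance measure, threshold rule, and function names (`hebbian_importance`, `consolidation_mask`, `masked_update`) are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch of Hebbian-style weight consolidation for incremental
# learning (illustrative; not the paper's exact method). Synapses whose
# Hebbian importance on an old task -- the co-activity of pre- and
# post-synaptic firing rates -- is high get masked out of gradient updates
# on the new task, so old knowledge is preserved while the remaining
# (plastic) synapses absorb new knowledge.

def hebbian_importance(pre_rates, post_rates):
    """Per-synapse importance as mean pre*post co-activity (Hebbian).

    pre_rates: (T, n_in), post_rates: (T, n_out) firing rates over T samples.
    Returns an (n_in, n_out) importance matrix.
    """
    return pre_rates.T @ post_rates / pre_rates.shape[0]

def consolidation_mask(importance, keep_fraction=0.5):
    """Mask = 0 for the most important (frozen) synapses, 1 for plastic ones."""
    thresh = np.quantile(importance, 1.0 - keep_fraction)
    return (importance < thresh).astype(importance.dtype)

def masked_update(W, grad, mask, lr=0.1):
    """Apply a gradient step only to synapses the mask leaves plastic."""
    return W - lr * mask * grad
```

Frozen synapses (mask = 0) keep their old-task values exactly; plastic ones (mask = 1) follow ordinary gradient updates for the new task, which matches the abstract's description of preserving old knowledge while storing new knowledge.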
