Learning in Associative Networks through Pavlovian Dynamics
Abstract: Hebbian learning theory is rooted in Pavlov's classical conditioning. While mathematical models of the former have been proposed and studied over the past decades, especially within spin-glass theory, only recently has it been shown numerically that one can write neural and synaptic dynamics that mirror Pavlovian conditioning mechanisms and also give rise to synaptic weights corresponding to the Hebbian learning rule. In this paper, we show that the same dynamics can be derived with equilibrium statistical-mechanics tools and basic, well-motivated modeling assumptions. We then show how to study the resulting system of coupled stochastic differential equations under a reasonable separation of neural and synaptic timescales. In particular, we analytically demonstrate that the synaptic evolution converges to the Hebbian learning rule in various settings, and we compute the variance of the stochastic process. Finally, drawing on evidence of pure-memory reinforcement during sleep stages, we show how the proposed model can simulate neural networks undergoing sleep-associated memory consolidation, thereby establishing the compatibility of Pavlovian learning with dreaming mechanisms.
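As a rough illustration of the timescale-separation idea described in the abstract (not the paper's actual model), the sketch below simulates slow, noisy synaptic dynamics that relax toward the outer product of the clamped neural states; the neurons are assumed to equilibrate instantly to each presented pattern, an idealization of the fast neural timescale. All parameter values and the specific drift term are assumptions for the sake of the example. The couplings drift toward the Hopfield-type Hebbian rule J_ij = (1/P) Σ_μ ξ^μ_i ξ^μ_j.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 20        # number of neurons (hypothetical size)
P = 3         # number of stored patterns
tau = 50.0    # slow synaptic timescale; neural timescale taken as ~1
dt = 1.0
steps = 20_000
noise = 0.01  # synaptic noise amplitude (hypothetical value)

# Random binary patterns, as in Hopfield-type associative networks
xi = rng.choice([-1.0, 1.0], size=(P, N))

J = np.zeros((N, N))  # synaptic couplings, initially naive

for t in range(steps):
    # Fast neural dynamics, idealized: neurons are clamped to the
    # presented pattern (the "conditioning" stimulus), i.e. they have
    # fully relaxed on the slow synaptic timescale.
    sigma = xi[t % P]

    # Slow synaptic drift toward the instantaneous Hebbian outer
    # product, plus weak noise (Ornstein-Uhlenbeck-like relaxation).
    drift = (np.outer(sigma, sigma) - J) / tau
    J += dt * drift + noise * np.sqrt(dt) * rng.standard_normal((N, N))

# Because tau is much longer than one presentation cycle, J relaxes to
# the time-averaged target, i.e. the Hebbian rule over all patterns.
J_hebb = np.einsum('mi,mj->ij', xi, xi) / P
print(np.abs(J - J_hebb).mean())  # small residual, set by the noise level
```

With the noise switched off and a single pattern, the drift term makes J converge exponentially to ξ_i ξ_j, which is the single-pattern Hebbian prescription; cycling through several patterns on the fast timescale yields their average.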