
Mixed memories in Hopfield networks

Published 7 Apr 2025 in math.PR (arXiv:2504.04879v1)

Abstract: We consider the class of Hopfield models of associative memory with activation function $F$ and state space $\{-1,1\}^N$, where each vertex of the cube describes a configuration of $N$ binary neurons. $M$ randomly chosen configurations, called patterns, are stored using an energy function designed to make them local minima. If they are, which is known to depend on how $M$ scales with $N$, then they can be retrieved using a dynamics that decreases the energy. However, storing the patterns in the energy function also creates unintended local minima, and thus false memories. Although this has been known since the earliest work on the subject, it has only been supported by numerical simulations and non-rigorous calculations, except in elementary cases. Our results are twofold. For a generic function $F$, we explicitly construct a set of configurations, called mixed memories, whose properties are intended to characterise the local minima of the energy function. For three prominent models, namely the classical, the dense and the modern Hopfield models, obtained for quadratic, polynomial and exponential functions $F$ respectively, we give conditions on the growth rate of $M$ which guarantee that, as $N$ diverges, mixed memories are fixed points of the retrieval dynamics and thus exact minima of the energy. We conjecture that in this regime, all local minima are mixed memories.
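The setup described in the abstract can be illustrated with a minimal numerical sketch. The code below is not the paper's construction; it is a toy implementation, under common conventions, of a generalized Hopfield energy $E(\sigma) = -\sum_\mu F(\langle \sigma, \xi^\mu \rangle)$ with the three activation functions mentioned (quadratic, polynomial, exponential; the degree 4 and the $\sqrt{N}$ scaling in the exponential are illustrative choices, not taken from the paper), together with a zero-temperature asynchronous retrieval dynamics and a candidate "mixed memory" formed as the sign of a sum of three patterns:

```python
import numpy as np

rng = np.random.default_rng(0)

N, M = 200, 5                            # neurons, stored patterns
xi = rng.choice([-1, 1], size=(M, N))    # M random binary patterns

# Activation functions for the three models discussed in the abstract.
# The polynomial degree and the exponential scaling are illustrative.
F = {
    "classical": lambda x: x**2,                     # quadratic
    "dense":     lambda x: x.astype(float)**4,       # polynomial
    "modern":    lambda x: np.exp(x / np.sqrt(N)),   # exponential
}

def energy(sigma, f):
    """Generalized Hopfield energy E(sigma) = -sum_mu F(<sigma, xi_mu>)."""
    return -np.sum(f(xi @ sigma))

def retrieve(sigma, f, sweeps=20):
    """Zero-temperature async dynamics: flip a spin only if it lowers E."""
    sigma = sigma.copy()
    for _ in range(sweeps):
        changed = False
        for i in rng.permutation(N):
            cand = sigma.copy()
            cand[i] = -cand[i]
            if energy(cand, f) < energy(sigma, f):
                sigma, changed = cand, True
        if not changed:       # fixed point of the dynamics reached
            break
    return sigma

# A symmetric mixture of three patterns: a candidate mixed memory.
mix = np.sign(xi[0] + xi[1] + xi[2]).astype(int)
fixed = retrieve(mix, F["classical"])
```

At this low load ($M/N = 0.025$) the stored patterns themselves are stable, and the three-pattern mixture is the simplest example of the unintended local minima the paper studies; whether such mixtures remain fixed points as $M$ grows with $N$ is exactly the question the paper's conditions address.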

Authors (1)
