Serial vs parallel recall in Blume-Emery-Griffiths neural networks
Abstract: Fully connected Blume-Emery-Griffiths neural networks performing pattern recognition and associative memory have been studied heuristically in the past (mainly via the replica trick under the replica-symmetric assumption) as a generalization of the standard Hopfield reference. In these notes, relying upon Guerra interpolation, we first re-obtain the existing picture rigorously. Next we show that, due to dilution in the patterns, these networks are able to switch from serial recall (where one pattern is retrieved at a time) to parallel recall (where several patterns are retrieved at once), and the larger the dilution, the stronger this emerging multi-tasking capability. In particular, we inspect the regimes of mild dilution (where only a low storage of patterns can be enabled) and extreme dilution (where a medium storage of patterns can be sustained) separately, as they give rise to different outcomes: the former displays hierarchical recall (the retrieved signals carry different amplitudes, organized hierarchically), while the latter performs equal-strength recall (an O(1) fraction of all the patterns is simultaneously retrieved with the same amplitude per pattern). Finally, in order to implement graded responses in the neurons, we also discuss variations on the theme obtained by enlarging the set of activity values the neurons may sustain, generalizing the Ghatak-Sherrington model for inverse freezing in Hebbian terms.
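The switch from serial to parallel recall can be illustrated numerically. The following is a minimal sketch, not the paper's exact Blume-Emery-Griffiths Hamiltonian: it uses three-state (spin-1) neurons with Hebbian couplings built from diluted patterns, a zero-temperature threshold dynamics, and illustrative parameter choices (N, K, the dilution level d, and the firing threshold are all assumptions). Because diluted patterns have largely disjoint supports, a mixture of two patterns relaxes to a state with large Mattis overlaps on both patterns at once.

```python
import numpy as np

rng = np.random.default_rng(0)

N, K, d = 400, 3, 0.7          # neurons, stored patterns, dilution level (illustrative)
theta = (1.0 - d) / 2.0        # firing threshold (hypothetical choice)

# Diluted patterns: each entry is 0 with probability d, else +/-1.
xi = rng.choice([-1, 0, 1], size=(K, N), p=[(1 - d) / 2, d, (1 - d) / 2])

# Hebbian couplings, no self-interaction.
J = (xi.T @ xi).astype(float) / N
np.fill_diagonal(J, 0.0)

# Start from a mixture of patterns 0 and 1 (supports are mostly disjoint).
sigma = np.clip(xi[0] + xi[1], -1, 1)

# Zero-temperature parallel dynamics for spin-1 neurons: fire sign(h) when the
# local field exceeds the threshold, stay quiescent (0) otherwise.
for _ in range(10):
    h = J @ sigma
    sigma = np.where(np.abs(h) > theta, np.sign(h), 0).astype(int)

# Mattis overlaps, normalized by the mean pattern support size N*(1-d):
# both mixed-in patterns are retrieved simultaneously, the third is not.
m = xi @ sigma / (N * (1.0 - d))
print(np.round(m, 2))
```

With no dilution (d = 0) the same dynamics collapses onto a single pattern, which is the serial-recall regime described in the abstract; here the overlaps on patterns 0 and 1 both remain large, a toy version of the multi-tasking behavior.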