On the capacity of neural networks

Published 2 Nov 2022 in cond-mat.dis-nn, physics.comp-ph, and quant-ph | arXiv:2211.07531v1

Abstract: The aim of this thesis is to compare the capacity of different models of neural networks. We start by analysing the problem-solving capacity of a single perceptron using a simple combinatorial argument. After some observations on the storage capacity of a basic network known as an associative memory, we introduce a powerful statistical-mechanical approach to calculate its capacity in the training-rule-dependent Hopfield model. With the aim of finding a more general definition that can be applied even to quantum neural nets, we then follow Gardner's work, which removes the dependence on the training rule, and comment on the results obtained by Lewenstein et al., who applied Gardner's methods to a recently proposed quantum perceptron model.
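
The combinatorial argument for the single-perceptron capacity mentioned in the abstract is usually stated via Cover's function-counting theorem; the short Python sketch below (an illustration assuming that standard formulation, not code from the thesis) counts the linearly separable dichotomies C(p, n) of p points in general position in n dimensions and shows the separable fraction dropping through 1/2 at the load alpha = p/n = 2, i.e. the capacity p_max ~ 2n.

    # Minimal sketch of the function-counting argument for perceptron capacity.
    # Assumes Cover's theorem: C(p, n) = 2 * sum_{k=0}^{n-1} binom(p-1, k)
    # for p points in general position in R^n (zero-threshold perceptron).
    from math import comb

    def cover_count(p: int, n: int) -> int:
        """Number of linearly separable dichotomies of p points in R^n."""
        return 2 * sum(comb(p - 1, k) for k in range(n))

    def separable_fraction(p: int, n: int) -> float:
        """Fraction of all 2^p dichotomies that a perceptron can realise."""
        return cover_count(p, n) / 2 ** p

    if __name__ == "__main__":
        n = 50  # illustrative input dimension
        for alpha in (1.0, 1.5, 2.0, 2.5, 3.0):  # load alpha = p / n
            p = int(alpha * n)
            print(f"alpha = {alpha:.1f}: P(separable) = {separable_fraction(p, n):.3f}")
        # As n grows, the printed fraction approaches a step function at
        # alpha = 2 (it equals exactly 1/2 there), which is the capacity
        # p_max ~ 2n that the combinatorial argument yields.

Running the script shows P(separable) = 1.000 for alpha = 1, exactly 0.500 at alpha = 2, and values near zero beyond it, matching the sharp capacity transition the abstract alludes to; the Hopfield and Gardner calculations generalise this question to networks storing many patterns.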
