
Universal approximation properties of shallow quadratic neural networks

Published 4 Oct 2021 in math.NA and cs.NA | arXiv:2110.01536v2

Abstract: In this paper we study shallow neural network functions that are linear combinations of compositions of activation functions with quadratic functions, which replace the standard affine linear functions, often called neurons. We show the universality of this approximation and prove convergence rate results based on the theory of wavelets and statistical learning. For simple test cases we show that this ansatz requires a smaller number of neurons than standard affine linear neural networks. Moreover, we investigate the efficiency of this approach for clustering tasks on the MNIST data set. Similar observations are made when comparing deep (multi-layer) networks.
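The core object described in the abstract — a neuron that applies an activation to a quadratic, rather than affine, function of the input — can be sketched as follows. This is a minimal illustration, not the paper's exact construction: the parameterization q(x) = xᵀAx + bᵀx + c and the choice of tanh as activation are assumptions made here for concreteness.

```python
import numpy as np

def quadratic_neuron(x, A, b, c, sigma=np.tanh):
    """One 'quadratic neuron': activation applied to q(x) = x^T A x + b^T x + c.

    A standard affine neuron would instead compute sigma(b^T x + c).
    """
    return sigma(x @ A @ x + b @ x + c)

def shallow_quadratic_net(x, weights, params):
    """Shallow network: a linear combination of quadratic neurons."""
    return sum(w * quadratic_neuron(x, A, b, c)
               for w, (A, b, c) in zip(weights, params))

# Illustrative example: a single quadratic neuron can represent a radially
# symmetric function sigma(r^2 - |x|^2), whereas affine neurons need many
# units to approximate such level sets.
x = np.array([1.0, 2.0])
A = -np.eye(2)                    # contributes -|x|^2
b = np.zeros(2)
c = 4.0                           # r^2 = 4
y = quadratic_neuron(x, A, b, c)  # tanh(4 - 5) = tanh(-1)
```

The example hints at why fewer neurons may suffice: curved decision boundaries (circles, ellipses) fall out of a single quadratic unit, while affine units produce only half-spaces.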
