Overcoming the Curse of Dimensionality in Neural Networks
Abstract: Let $A$ be a set and $V$ a real Hilbert space. Let $H$ be a real Hilbert space of functions $f:A\to V$, and assume $H$ is continuously embedded in the Banach space of bounded functions. For $i=1,\dots,n$, let $(x_i,y_i)\in A\times V$ comprise our dataset. Let $0<q<1$ and let $f^*\in H$ be the unique global minimizer of the functional \begin{equation*} u(f) = \frac{q}{2}\Vert f\Vert_{H}^{2} + \frac{1-q}{2n}\sum_{i=1}^{n}\Vert f(x_i)-y_i\Vert_{V}^{2}. \end{equation*} In this paper we show that for each $k\in\mathbb{N}$ there exists a two-layer network, whose first layer consists of $k$ functions that are Riesz representations in the Hilbert space $H$ of point-evaluation functionals and whose second layer is a weighted sum of the first layer, such that the functions $f_k$ realized by these networks satisfy \begin{equation*} \Vert f_{k}-f^*\Vert_{H}^{2} \leq \Bigl( o(1) + \frac{C}{q^{2}}\, E\bigl[ \Vert Du_{I}(f)\Vert_{H}^{2} \bigr] \Bigr)\frac{1}{k}. \end{equation*} Let us note that the $x_i$ need not lie in a linear space, and the $y_i$ lie in a possibly infinite-dimensional Hilbert space $V$. The error estimate is independent of the data size $n$, and when $V$ is finite dimensional it is also independent of the dimension of $V$. By choosing the Hilbert space $H$ appropriately, the computational complexity of evaluating the Riesz representations of point evaluations may be small, and the resulting network then has low computational complexity.
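To make the construction concrete: in a reproducing kernel Hilbert space $H$ with kernel $K$, the Riesz representation of the point-evaluation functional at $x$ is $K(x,\cdot)$, so the two-layer network described above has the form $f_k = \sum_{j=1}^{k} w_j K(c_j,\cdot)$ with centers $c_j$ taken from the data. The sketch below, which is illustrative rather than the paper's construction (the Gaussian kernel, the choice of centers, and the closed-form normal-equation solve are all assumptions), fits such a network by minimizing the functional $u$ restricted to this $k$-dimensional span.

```python
import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    """K(x, z) = exp(-|x - z|^2 / (2 sigma^2)); rows of X and Z are points in A."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_two_layer(X, y, centers, q=0.1):
    """Minimize q/2 ||f||_H^2 + (1-q)/(2n) sum_i |f(x_i) - y_i|^2 over
    f = sum_j w_j K(c_j, .). With Gram matrix G = K(C, C) and first-layer
    activations Kx = K(X, C), the normal equations read
    (q G + (1-q)/n Kx^T Kx) w = (1-q)/n Kx^T y."""
    n = X.shape[0]
    G = gaussian_kernel(centers, centers)   # inner products of representers
    Kx = gaussian_kernel(X, centers)        # first layer evaluated on the data
    A = q * G + (1.0 - q) / n * (Kx.T @ Kx)
    b = (1.0 - q) / n * (Kx.T @ y)
    # tiny jitter for numerical stability of the solve
    w = np.linalg.solve(A + 1e-10 * np.eye(len(centers)), b)
    return w

def predict(Xnew, centers, w):
    """Second layer: weighted sum of the k representers K(c_j, .)."""
    return gaussian_kernel(Xnew, centers) @ w

# Toy usage: k = 10 representers, n = 50 samples of a smooth target.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = np.sin(3.0 * X[:, 0])
centers = X[:10]                       # centers drawn from the dataset
w = fit_two_layer(X, y, centers, q=0.01)
pred = predict(X, centers, w)
```

Here the first layer is frozen (the representers are determined by the kernel and the chosen data points); only the second-layer weights $w$ are trained, which reduces the optimization to a $k\times k$ linear solve.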