Integral Representations of Sobolev Spaces via ReLU$^k$ Activation Function and Optimal Error Estimates for Linearized Networks

Published 1 May 2025 in math.NA and cs.NA (arXiv:2505.00351v2)

Abstract: This paper presents two main theoretical results concerning shallow neural networks with ReLU$^k$ activation functions. We establish a novel integral representation for Sobolev spaces, showing that every function in $H^{\frac{d+2k+1}{2}}(\Omega)$ can be expressed as an $L^2$-weighted integral of ReLU$^k$ ridge functions over the unit sphere. This result mirrors the known representation of Barron spaces and highlights a fundamental connection between Sobolev regularity and neural network representations. Moreover, we prove that linearized shallow networks -- constructed by fixing the inner parameters and optimizing only the linear coefficients -- achieve optimal approximation rates $O(n^{-\frac{1}{2}-\frac{2k+1}{2d}})$ in Sobolev spaces.
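The linearized-network setup described in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's exact construction: the inner parameters (unit-sphere directions $w_i$ and biases $b_i$) are fixed at random, and only the outer linear coefficients are fit by least squares; the target function and all sampling choices below are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, k = 2, 200, 2          # input dimension, number of neurons, ReLU power

# Fixed inner parameters: random unit directions on the sphere and biases.
W = rng.standard_normal((n, d))
W /= np.linalg.norm(W, axis=1, keepdims=True)
b = rng.uniform(-1.0, 1.0, size=n)

def features(X):
    """ReLU^k ridge features: max(0, w_i . x - b_i)^k."""
    return np.maximum(X @ W.T - b, 0.0) ** k

# Hypothetical smooth target on [-1, 1]^2 (not from the paper).
def target(X):
    return np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])

X_train = rng.uniform(-1.0, 1.0, size=(1000, d))
y_train = target(X_train)

# "Linearized" training: optimize only the outer coefficients.
coef, *_ = np.linalg.lstsq(features(X_train), y_train, rcond=None)

X_test = rng.uniform(-1.0, 1.0, size=(500, d))
err = np.sqrt(np.mean((features(X_test) @ coef - target(X_test)) ** 2))
print(f"test RMSE: {err:.4f}")
```

Because only the linear coefficients are optimized, the fit reduces to a single least-squares solve; the paper's result says that for sufficiently regular targets this simple scheme already attains the optimal rate in $n$.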

Authors (3)
