Integral Representations of Sobolev Spaces via ReLU$^k$ Activation Function and Optimal Error Estimates for Linearized Networks
Abstract: This paper presents two main theoretical results concerning shallow neural networks with ReLU$^k$ activation functions. We establish a novel integral representation for Sobolev spaces, showing that every function in $H^{\frac{d+2k+1}{2}}(\Omega)$ can be expressed as an $L^2$-weighted integral of ReLU$^k$ ridge functions over the unit sphere. This result mirrors the known representation of Barron spaces and highlights a fundamental connection between Sobolev regularity and neural network representations. Moreover, we prove that linearized shallow networks -- constructed by fixing the inner parameters and optimizing only the linear coefficients -- achieve the optimal approximation rate $O(n^{-\frac{1}{2}-\frac{2k+1}{2d}})$ in Sobolev spaces.
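
For concreteness, the integral representation has the schematic form below; the notation, the range of the bias variable, and the normalization are illustrative assumptions and are not quoted from the paper. With $\sigma_k(t) = \max(0,t)^k$ and $\Omega$ contained in the unit ball, a function $u \in H^{\frac{d+2k+1}{2}}(\Omega)$ is written as
\[
  u(x) \;=\; \int_{\mathbb{S}^{d-1}} \int_{-1}^{1}
      \sigma_k(\omega \cdot x - b)\, g(\omega, b)\, \mathrm{d}b\, \mathrm{d}\omega,
  \qquad g \in L^2\!\left(\mathbb{S}^{d-1} \times [-1,1]\right),
\]
i.e., an $L^2$-weighted superposition of ReLU$^k$ ridge functions indexed by directions $\omega$ on the unit sphere and biases $b$.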
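The "linearized" construction referenced in the abstract fixes the inner parameters of the shallow network and optimizes only the outer linear coefficients. The following is a minimal sketch of that idea, assuming randomly sampled directions and biases and a least-squares fit; the sampling scheme and all names are illustrative, not the paper's construction.

```python
# Minimal sketch of a linearized shallow ReLU^k network: inner parameters
# (directions on the unit sphere, biases) are fixed, and only the outer
# linear coefficients are fit by least squares. Illustrative assumption,
# not the paper's specific construction.
import numpy as np

def relu_k(t, k):
    """ReLU^k activation: max(0, t)**k."""
    return np.maximum(t, 0.0) ** k

def fit_linearized_network(X, y, n_neurons, k=2, seed=0):
    """Fix inner parameters, then solve for the linear coefficients.

    X : (m, d) sample points, y : (m,) target values.
    Returns (omegas, biases, coeffs) defining
        u_n(x) = sum_i coeffs[i] * relu_k(omegas[i] @ x - biases[i], k).
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Directions drawn uniformly on the unit sphere S^{d-1}.
    omegas = rng.standard_normal((n_neurons, d))
    omegas /= np.linalg.norm(omegas, axis=1, keepdims=True)
    # Biases drawn uniformly in [-1, 1] (assumes X lies in the unit ball).
    biases = rng.uniform(-1.0, 1.0, size=n_neurons)
    # Feature matrix of the fixed ReLU^k ridge functions.
    Phi = relu_k(X @ omegas.T - biases, k)        # shape (m, n_neurons)
    # Only the outer coefficients are optimized: a linear least-squares fit.
    coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return omegas, biases, coeffs

def predict(X, omegas, biases, coeffs, k=2):
    return relu_k(X @ omegas.T - biases, k) @ coeffs

# Usage example: approximate a smooth target on the unit ball in R^2.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1, 1, size=(2000, 2))
    X = X[np.linalg.norm(X, axis=1) <= 1.0]
    y = np.sin(np.pi * X[:, 0]) * np.cos(np.pi * X[:, 1])
    params = fit_linearized_network(X, y, n_neurons=256, k=2)
    err = np.sqrt(np.mean((predict(X, *params) - y) ** 2))
    print(f"RMS training error with 256 fixed-feature neurons: {err:.3e}")
```

Because the inner parameters are frozen, the training problem is linear in the remaining coefficients, which is what allows approximation rates for this class to be analyzed directly in Sobolev norms.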