Sharp Lower Bounds for Linearized ReLU^k Approximation on the Sphere

Published 5 Oct 2025 in math.NA, cs.LG, and cs.NA | (2510.04060v1)

Abstract: We prove a saturation theorem for linearized shallow ReLU$^k$ neural networks on the unit sphere $\mathbb S^d$. For any antipodally quasi-uniform set of centers, if the target function has smoothness $r>\tfrac{d+2k+1}{2}$, then the best $\mathcal L^2(\mathbb S^d)$ approximation cannot converge faster than order $n^{-\frac{d+2k+1}{2d}}$. This lower bound matches existing upper bounds, thereby establishing the exact saturation order $\tfrac{d+2k+1}{2d}$ for such networks. Our results place linearized neural-network approximation firmly within the classical saturation framework and show that, although ReLU$^k$ networks outperform finite elements under equal degrees $k$, this advantage is intrinsically limited.
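
The saturation statement can be written as a single displayed inequality. Here $V_n^k$ is introduced only as shorthand for the linear span generated by a linearized shallow ReLU$^k$ network with $n$ antipodally quasi-uniform centers, and the constant $c>0$ is independent of $n$; the precise smoothness class behind the parameter $r$ is not named in the abstract, so this is a sketch of the claimed rate rather than the paper's exact theorem:

```latex
% Lower bound from the abstract: for every target f of
% smoothness r > (d+2k+1)/2, the best L^2(S^d) approximation
% from the linearized ReLU^k space V_n^k (notation assumed here)
% saturates at order n^{-(d+2k+1)/(2d)}.
\[
  \inf_{g \in V_n^{k}} \,\bigl\| f - g \bigr\|_{\mathcal L^2(\mathbb S^d)}
  \;\ge\; c\, n^{-\frac{d+2k+1}{2d}},
  \qquad r > \tfrac{d+2k+1}{2},
\]
```

Since the same order appears in the existing upper bounds, the rate $n^{-\frac{d+2k+1}{2d}}$ is exact: smoother targets cannot be approximated any faster by these linearized networks.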


Authors (2)
