Worst-case Nonlinear Regression with Error Bounds

Published 18 Jan 2026 in eess.SY (arXiv:2601.12334v1)

Abstract: This paper proposes an active-learning approach to worst-case nonlinear regression with deterministic error guarantees. Given a known nonlinear function defined over a compact set, we compute a surrogate model, such as a feedforward neural network, by minimizing the maximum absolute approximation error. To address the nonsmooth nature of the resulting minimax problem, we introduce a smooth approximation of the $L_\infty$-type loss that enables efficient gradient-based training. We iteratively enrich the training set by actively learning points of largest approximation error through global optimization. The resulting models admit certified worst-case error bounds, either constant or input-dependent, over the entire input domain. The approach is demonstrated through approximations of nonlinear functions and nonconvex sets, as well as through the derivation of uncertain models of more complex nonlinear dynamics within a given model class, and the approximation of explicit model predictive control laws.
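The abstract's two key ingredients, a smooth surrogate of the worst-case ($L_\infty$-type) loss and an active-learning loop that adds the point of largest error, can be illustrated with a minimal sketch. This is not the paper's implementation: the target function, the polynomial surrogate (standing in for a feedforward neural network), the log-sum-exp smoothing, and the dense-grid search (standing in for a global optimizer) are all illustrative assumptions.

```python
# Hypothetical sketch of active-learning minimax regression: minimize a
# log-sum-exp smoothing of max_i |e_i| by subgradient descent, then enrich
# the training set with the point of largest error found by a grid search
# (an assumed stand-in for the paper's global optimization step).
import numpy as np

def f(x):                          # illustrative known nonlinear function
    return np.sin(3 * x) * np.exp(-x ** 2)

def features(x, degree=8):         # polynomial surrogate (stand-in for a NN)
    return np.vander(x, degree + 1, increasing=True)

def smooth_linf_grad(theta, X, y, beta=50.0):
    """Gradient of (1/beta) * log sum exp(beta * |e_i|), a smooth max."""
    e = X @ theta - y
    a = np.abs(e)
    w = np.exp(beta * (a - a.max()))    # numerically stable softmax weights
    w /= w.sum()
    return X.T @ (w * np.sign(e))       # weighted signed feature average

domain = np.linspace(-1.0, 1.0, 2001)   # dense grid over the compact set
x_train = np.linspace(-1.0, 1.0, 5)     # small initial training set
theta = np.zeros(9)

for _ in range(20):                     # active-learning outer loop
    X, y = features(x_train), f(x_train)
    for k in range(2000):               # diminishing-step subgradient descent
        theta -= 0.1 / np.sqrt(1.0 + k) * smooth_linf_grad(theta, X, y)
    err = np.abs(features(domain) @ theta - f(domain))
    x_worst = domain[np.argmax(err)]    # point of largest approximation error
    x_train = np.append(x_train, x_worst)

print(f"grid-estimated worst-case error: {err.max():.4f}")
```

The log-sum-exp term makes the nonsmooth max differentiable, so standard gradient methods apply; a certified bound as in the paper would additionally require a deterministic global maximization of the error over the whole domain rather than a finite grid.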

Authors (1)
