
Subgradient Regularization: A Descent-Oriented Subgradient Method for Nonsmooth Optimization

Published 11 May 2025 in math.OC (arXiv:2505.07143v1)

Abstract: In nonsmooth optimization, a negative subgradient is not necessarily a descent direction, making the design of convergent descent methods based on zeroth-order and first-order information a challenging task. The well-studied bundle methods and gradient sampling algorithms construct descent directions by aggregating subgradients at nearby points in seemingly different ways, and are often complicated or lack deterministic guarantees. In this work, we identify a unifying principle behind these approaches, and develop a general framework of descent methods under the abstract principle that provably converge to stationary points. Within this framework, we introduce a simple yet effective technique, called subgradient regularization, to generate stable descent directions for a broad class of nonsmooth marginal functions, including finite maxima or minima of smooth functions. When applied to the composition of a convex function with a smooth map, the method naturally recovers the prox-linear method and, as a byproduct, provides a new dual interpretation of this classical algorithm. Numerical experiments demonstrate the effectiveness of our methods on several challenging classes of nonsmooth optimization problems, including the minimization of Nesterov's nonsmooth Chebyshev-Rosenbrock function.
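The abstract's opening claim, that a negative subgradient need not be a descent direction, can be checked numerically. The function and point below are a standard textbook illustration chosen for this page, not an example taken from the paper: for f(x1, x2) = |x1| + 2|x2| at the point (1, 0), the vector (1, 2) is a valid subgradient, yet stepping along its negative increases f.

```python
def f(x1, x2):
    # f(x1, x2) = |x1| + 2|x2|: a simple nonsmooth convex function
    return abs(x1) + 2 * abs(x2)

# At the point (1, 0), the subdifferential of f is {1} x [-2, 2],
# so g = (1, 2) is a valid subgradient.
x1, x2 = 1.0, 0.0
g1, g2 = 1.0, 2.0

t = 1e-3  # small step along the negative subgradient
increase = f(x1 - t * g1, x2 - t * g2) - f(x1, x2)
print(increase)  # positive: -g is an ascent direction at this point
```

Here f(x1 - t, -2t) = (1 - t) + 4t = 1 + 3t for small t > 0, so the function value strictly increases along the negative subgradient; this is the phenomenon that motivates aggregating subgradients from nearby points, as bundle methods, gradient sampling, and the paper's subgradient regularization all do.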


Authors (2)
