
LeAP-SSN: A Semismooth Newton Method with Global Convergence Rates

Published 22 Aug 2025 in math.OC (arXiv:2508.16468v1)

Abstract: We propose LeAP-SSN (Levenberg--Marquardt Adaptive Proximal Semismooth Newton), a semismooth Newton-type method with a simple, parameter-free globalisation strategy. In Hilbert spaces, the method converges from arbitrary starting points: to stationary points in nonconvex settings, and to a global minimum under a Polyak--Lojasiewicz condition. It employs an adaptive Levenberg--Marquardt regularisation of the Newton steps, combined with backtracking, and requires no knowledge of problem-specific constants. We establish global nonasymptotic rates: $\mathcal{O}(1/k)$ for convex problems in terms of objective values, $\mathcal{O}(1/\sqrt{k})$ under nonconvexity in terms of subgradients, and linear convergence under a Polyak--Lojasiewicz condition. The algorithm achieves superlinear convergence under mild semismoothness and Dennis--Mor\'e or partial smoothness conditions, even for non-isolated minimisers. By combining strong global guarantees with superlinear local rates in a fully parameter-agnostic framework, LeAP-SSN bridges the gap between globally convergent algorithms and the fast asymptotics of Newton's method. The practical efficiency of the method is illustrated on representative problems from imaging, contact mechanics, and machine learning.
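To make the globalisation mechanism concrete, here is a minimal sketch of a Levenberg--Marquardt-regularised Newton iteration with backtracking on the regularisation parameter, for a smooth objective. This is an illustration of the general idea only, not the LeAP-SSN algorithm itself: the paper's method operates on nonsmooth problems via proximal semismooth Newton steps, and its adaptive rule for the regularisation parameter differs from the simple doubling/halving scheme assumed below.

```python
import numpy as np

def lm_newton(f, grad, hess, x0, lam=1.0, tol=1e-8, max_iter=200):
    """Levenberg--Marquardt-regularised Newton method (illustrative sketch).

    Solves (H + lam*I) d = -g for the step d, increasing lam (backtracking)
    until the step decreases f, and relaxing lam after each accepted step.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:  # stationarity reached
            break
        H = hess(x)
        # Backtracking: grow lam until the regularised step is a descent step.
        # For large lam the step approaches a small gradient step, so this
        # terminates whenever g is nonzero.
        while True:
            d = np.linalg.solve(H + lam * np.eye(len(x)), -g)
            if f(x + d) < f(x):
                break
            lam *= 2.0
        x = x + d
        lam = max(lam / 2.0, 1e-12)  # adapt lam downward after success
    return x
```

On well-behaved problems the regularisation vanishes near the solution and the iteration recovers the fast local behaviour of Newton's method, which is the effect the paper's globalisation strategy is designed to preserve.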
