
Revisiting Randomized Smoothing: Nonsmooth Nonconvex Optimization Beyond Global Lipschitz Continuity

Published 19 Aug 2025 in math.OC (arXiv:2508.13496v1)

Abstract: Randomized smoothing is a widely adopted technique for optimizing nonsmooth objective functions. However, its efficiency analysis typically relies on global Lipschitz continuity, a condition rarely met in practical applications. To address this limitation, we introduce a new subgradient growth condition that naturally encompasses a wide range of locally Lipschitz functions, with the classical global Lipschitz function as a special case. Under this milder condition, we prove that randomized smoothing yields a differentiable function that satisfies certain generalized smoothness properties. To optimize such functions, we propose novel randomized smoothing gradient algorithms that, with high probability, converge to $(\delta, \epsilon)$-Goldstein stationary points and achieve a sample complexity of $\tilde{\mathcal{O}}(d^{5/2}\delta^{-1}\epsilon^{-4})$. By incorporating variance reduction techniques, we further improve the sample complexity to $\tilde{\mathcal{O}}(d^{3/2}\delta^{-1}\epsilon^{-3})$, matching the optimal $\epsilon$-bound under the global Lipschitz assumption, up to a logarithmic factor. Experimental results validate the effectiveness of our proposed algorithms.
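The core object in the abstract is the randomized smoothing of a nonsmooth function, $f_\delta(x) = \mathbb{E}_u[f(x + \delta u)]$ with $u$ drawn uniformly from the unit ball or sphere, whose gradient can be estimated from function values alone. The paper's actual algorithms are not reproduced here; the sketch below shows only the standard two-point zeroth-order gradient estimator that such methods are built on, with the function `smoothed_grad_estimate` and all parameter names being illustrative choices, not the authors' notation:

```python
import numpy as np

def smoothed_grad_estimate(f, x, delta=0.1, num_samples=64, rng=None):
    """Monte Carlo estimate of grad f_delta(x), where
    f_delta(x) = E_u[f(x + delta * u)], u uniform on the unit sphere.

    Uses the classical two-point estimator
        g = (d / (2*delta)) * (f(x + delta*u) - f(x - delta*u)) * u,
    averaged over num_samples random directions. This is a generic
    sketch, not the paper's (variance-reduced) algorithm.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.shape[0]
    g = np.zeros(d)
    for _ in range(num_samples):
        u = rng.normal(size=d)
        u /= np.linalg.norm(u)  # direction uniform on the unit sphere
        g += (d / (2.0 * delta)) * (f(x + delta * u) - f(x - delta * u)) * u
    return g / num_samples
```

For a nonsmooth example such as $f(x) = \|x\|_1$, the estimator is unbiased for $\nabla f_\delta$ and, at points where $f$ is differentiable and $\delta$ is small, it approaches the true gradient; its variance grows with the dimension $d$, which is why the paper's sample complexities carry explicit $d$ factors.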


Authors (3)
