Efficient Saddle Point Escape in High Dimensions via Adaptive Perturbation and Subspace Descent

Published 19 Sep 2024 in math.OC (arXiv:2409.12604v3)

Abstract: We investigate high-dimensional non-convex optimization, focusing on the algorithmic difficulties posed by saddle points and regions of flat curvature. We develop a unified framework that integrates stochastic perturbations, curvature-adaptive learning rates, and randomized subspace descent to improve escape efficiency and scalability. Our theoretical analysis shows that gradient flow almost surely avoids strict saddles, with escape likelihood increasing exponentially in the ambient dimension. For noise-perturbed gradient descent, we derive explicit escape-time bounds that depend on curvature and noise magnitude. Adaptive step sizes further reduce escape time by adjusting to local gradient variance. To improve scalability, we establish global convergence rates for randomized subspace descent using projections of logarithmic dimension that preserve descent direction with high probability. Numerical experiments on nonlinear and constrained objectives validate these results and demonstrate practical robustness in large-scale settings.
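The abstract names two algorithmic ingredients that are easy to illustrate: noise-perturbed gradient descent escaping a strict saddle, and a randomized-subspace step built from a low-dimensional random projection. The sketch below is not the paper's algorithm; the toy objective, step sizes, noise scale, and function names are all illustrative assumptions.

```python
import numpy as np

# Toy strict saddle: f(x) = 0.5 * x^T H x with H = diag(1, ..., 1, -1),
# so the origin is a strict saddle with one negative-curvature
# (escape) direction along the last coordinate.

def grad(x):
    h = np.ones(x.size)
    h[-1] = -1.0
    return h * x  # gradient of 0.5 * x^T diag(h) x

def perturbed_gd(d=50, steps=200, lr=0.1, noise=1e-3, seed=0):
    """Noise-perturbed gradient descent started on the stable manifold."""
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    x[0] = 1.0  # plain GD from here converges to the saddle at 0
    for _ in range(steps):
        x = x - lr * grad(x) + noise * rng.standard_normal(d)
    return x

def subspace_step(x, g, k, lr, rng):
    """One randomized-subspace step: descend only inside a random
    k-dimensional subspace spanned by the rows of S (E[S^T S] = I)."""
    S = rng.standard_normal((k, x.size)) / np.sqrt(k)
    return x - lr * (S.T @ (S @ g))

x = perturbed_gd()
print(abs(x[-1]))  # noise is amplified along the escape direction
```

Because the step direction satisfies g·(S^T S g) = ‖Sg‖² ≥ 0, a randomized-subspace step is never an ascent step; per the abstract, the paper's analysis shows projections of dimension logarithmic in d suffice to preserve a descent direction with high probability.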

Authors (2)
