Randomized coordinate gradient descent almost surely escapes strict saddle points
Published 11 Aug 2025 in math.OC, cs.NA, math.DS, math.NA, and math.PR (arXiv:2508.07535v1)
Abstract: We analyze the behavior of randomized coordinate gradient descent for nonconvex optimization, proving that under standard assumptions, the iterates almost surely escape strict saddle points. By formulating the method as a nonlinear random dynamical system and characterizing neighborhoods of critical points, we establish this result through the center-stable manifold theorem.
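The method analyzed is randomized coordinate gradient descent: at each step a single coordinate is chosen uniformly at random and updated by a gradient step in that coordinate alone. A minimal sketch (not the paper's code; the objective `f(x, y) = x^2 - y^2 + y^4/2` is a hypothetical example with a strict saddle at the origin and minima at `(0, ±1)`):

```python
import random

def grad(v):
    # Gradient of f(x, y) = x^2 - y^2 + y^4/2, which has a strict saddle
    # at (0, 0): the Hessian there is diag(2, -2), with a negative eigenvalue.
    x, y = v
    return [2.0 * x, -2.0 * y + 2.0 * y ** 3]

def rcgd(v0, step=0.1, iters=2000, seed=0):
    """Randomized coordinate gradient descent: each iteration picks one
    coordinate uniformly at random and takes a gradient step in it."""
    rng = random.Random(seed)
    v = list(v0)
    for _ in range(iters):
        i = rng.randrange(len(v))      # random coordinate
        v[i] -= step * grad(v)[i]      # partial-derivative step on that coordinate
    return v

# Started near the saddle, the iterates drift away from (0, 0) along the
# unstable (y) direction and settle near a minimizer (0, ±1).
v = rcgd([0.5, 1e-3])
```

The randomness in the coordinate choice is what makes the iteration a random dynamical system; the paper's result says that, under standard assumptions, the set of initializations attracted to a strict saddle has measure zero almost surely.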