Randomized Submanifold Subgradient Method for Optimization over Stiefel Manifolds
Abstract: Optimization over the Stiefel manifold is a fundamental computational problem in many scientific and engineering applications. Despite considerable research effort, high-dimensional optimization problems over the Stiefel manifold remain challenging, particularly when the objective function is nonsmooth. In this paper, we propose a novel coordinate-type algorithm, named the \emph{randomized submanifold subgradient method} (RSSM), for minimizing a possibly nonsmooth weakly convex function over the Stiefel manifold, and we study its convergence behavior. Like coordinate-type algorithms in the Euclidean setting, RSSM has a low per-iteration cost and is suitable for high-dimensional problems. We prove that RSSM has an iteration complexity of $\mathcal{O}(\varepsilon^{-4})$ for driving a natural stationarity measure below $\varepsilon$, both in expectation and in the almost-sure sense. To the best of our knowledge, this is the first convergence guarantee for a coordinate-type algorithm for nonsmooth optimization over the Stiefel manifold. To establish this guarantee, we develop two new theoretical tools, namely a Riemannian subgradient inequality for weakly convex functions on proximally smooth matrix manifolds and an averaging operator that induces an adaptive metric on the ambient Euclidean space, both of which could be of independent interest. Lastly, we present numerical results on robust subspace recovery and orthogonal dictionary learning to demonstrate the viability of our proposed method.
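The abstract does not spell out the update rule, but the coordinate-type idea it describes can be illustrated concretely. The following NumPy sketch shows one plausible instantiation under stated assumptions: at each iteration, sample a random block of columns of the current iterate, take a Riemannian subgradient step on the submanifold obtained by fixing the remaining columns, and retract back to the Stiefel manifold. The function names (`rssm_sketch`, `subgrad`), the column-block sampling scheme, and the choice of polar retraction are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def polar_retraction(Y):
    # Nearest matrix with orthonormal columns: the polar factor via thin SVD.
    U, _, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ Vt

def rssm_sketch(subgrad, X0, step_sizes, block_size, seed=0):
    """Hypothetical randomized submanifold subgradient iteration on St(n, p).

    `subgrad(X)` should return a Euclidean subgradient of f at X (n x p).
    Only a sampled block of columns moves per iteration; the rest stay
    fixed, which is the source of the low per-iteration cost.
    """
    rng = np.random.default_rng(seed)
    X = X0.copy()
    n, p = X.shape
    for gamma in step_sizes:
        idx = rng.choice(p, size=block_size, replace=False)
        mask = np.zeros(p, dtype=bool)
        mask[idx] = True
        B, Xc = X[:, mask], X[:, ~mask]      # moving block / fixed columns
        G = subgrad(X)[:, mask]              # subgradient restricted to the block
        G = G - Xc @ (Xc.T @ G)              # keep the step orthogonal to fixed columns
        S = B.T @ G
        G = G - B @ ((S + S.T) / 2)          # project onto the small Stiefel tangent space
        # Both B and G are orthogonal to Xc, so the retracted block stays
        # orthogonal to the fixed columns and X remains on St(n, p).
        X[:, mask] = polar_retraction(B - gamma * G)
    return X
```

As a usage example, for the nonsmooth convex (hence weakly convex) objective $f(X) = \|AX\|_1$, one valid choice is `subgrad = lambda X: A.T @ np.sign(A @ X)`, paired with a diminishing step-size sequence such as $\gamma_k \propto k^{-1/2}$.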