
On Relatively Smooth Optimization over Riemannian Manifolds

Published 5 Aug 2025 in math.OC (arXiv:2508.03048v1)

Abstract: We study optimization over Riemannian embedded submanifolds, where the objective function is relatively smooth in the ambient Euclidean space. Such problems have broad applications but are still largely unexplored. We introduce two Riemannian first-order methods, namely the retraction-based and projection-based Riemannian Bregman gradient methods, by incorporating the Bregman distance into the update steps. The retraction-based method can handle nonsmooth optimization; at each iteration, the update direction is generated by solving a convex optimization subproblem constrained to the tangent space. We show that when the reference function is of the quartic form $h(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$, the constrained subproblem admits a closed-form solution. The projection-based approach can be applied to smooth Riemannian optimization, which solves an unconstrained subproblem in the ambient Euclidean space. Both methods are shown to achieve an iteration complexity of $\mathcal{O}(1/\epsilon^2)$ for finding an $\epsilon$-approximate Riemannian stationary point. When the manifold is compact, we further develop stochastic variants and establish a sample complexity of $\mathcal{O}(1/\epsilon^4)$. Numerical experiments on the nonlinear eigenvalue problem and low-rank quadratic sensing problem demonstrate the advantages of the proposed methods.
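To make the quartic reference function concrete, the sketch below shows a plain Euclidean Bregman gradient step with $h(x) = \frac{1}{4}\|x\|^4 + \frac{1}{2}\|x\|^2$, for which $\nabla h(x) = (\|x\|^2 + 1)\,x$. This is not the paper's tangent-space-constrained or projected update; it only illustrates why this $h$ yields a closed form: the mirror equation $\nabla h(y) = \nabla h(x) - \frac{1}{L}\nabla f(x)$ reduces to a scalar cubic with a unique positive root. The function names and the step size convention are illustrative assumptions.

```python
import numpy as np

def bregman_step(x, grad_f, L):
    """One unconstrained Bregman (mirror descent) step with the quartic
    reference h(x) = 1/4 ||x||^4 + 1/2 ||x||^2 (illustrative sketch only,
    not the paper's manifold-constrained update)."""
    # grad h(x) = (||x||^2 + 1) x; the step solves grad h(y) = g below.
    g = (np.dot(x, x) + 1.0) * x - grad_f(x) / L
    gnorm2 = np.dot(g, g)
    if gnorm2 == 0.0:
        return np.zeros_like(x)
    # Ansatz y = t * g with t > 0 gives (t^2 ||g||^2 + 1) t = 1,
    # i.e. the cubic ||g||^2 t^3 + t - 1 = 0, which is strictly
    # increasing in t and so has a unique real (positive) root.
    roots = np.roots([gnorm2, 0.0, 1.0, -1.0])
    real_roots = roots[np.abs(roots.imag) < 1e-10].real
    t = real_roots[real_roots > 0][0]
    return t * g
```

One can sanity-check a step by verifying that the returned point $y$ satisfies the mirror equation, i.e. $(\|y\|^2 + 1)\,y = \nabla h(x) - \frac{1}{L}\nabla f(x)$.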
