Riemannian conditional gradient methods for composite optimization problems
Abstract: In this paper, we propose Riemannian conditional gradient methods for minimizing composite functions, i.e., functions expressible as the sum of a smooth function and a geodesically convex one. We analyze the convergence of the proposed algorithms under three step-size strategies: adaptive, diminishing, and Armijo-based. We establish a convergence rate of $\mathcal{O}(1/k)$ for the adaptive and diminishing step sizes, where $k$ denotes the iteration count. Additionally, we derive an iteration complexity of $\mathcal{O}(1/\epsilon^2)$ for the Armijo step-size strategy to reach $\epsilon$-optimality, where $\epsilon$ is the optimality tolerance. Finally, the effectiveness of our algorithms is validated through numerical experiments on the sphere and Stiefel manifolds.
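To illustrate the conditional gradient template and the diminishing step size $\gamma_k = 2/(k+2)$ mentioned in the abstract, here is a minimal Euclidean sketch on the probability simplex. This is an assumed, simplified analogue, not the paper's Riemannian algorithm: on a manifold, the convex-combination update below would be replaced by a geodesic or retraction step, and the linear minimization oracle would act on tangent vectors. The function name `frank_wolfe_simplex` and the quadratic objective are illustrative choices.

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, num_iters=500):
    """Conditional gradient (Frank-Wolfe) over the probability simplex
    with the classic diminishing step size gamma_k = 2/(k+2)."""
    x = x0.copy()
    for k in range(num_iters):
        g = grad(x)
        # Linear minimization oracle over the simplex: the minimizer of a
        # linear function over the simplex is one of its vertices.
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0
        gamma = 2.0 / (k + 2)
        # Euclidean update; a Riemannian method would instead move along
        # a geodesic (or apply a retraction) in the direction of s.
        x = (1.0 - gamma) * x + gamma * s
    return x

# Minimize f(x) = 0.5*||x - b||^2 over the simplex; since b lies in the
# simplex, the minimizer is b itself and the iterates approach it at the
# O(1/k) rate discussed in the abstract.
b = np.array([0.2, 0.3, 0.5])
x0 = np.array([1.0, 0.0, 0.0])
x_star = frank_wolfe_simplex(lambda x: x - b, x0)
```

The update stays feasible by construction (a convex combination of simplex points), which is the defining projection-free property that Riemannian conditional gradient methods carry over to manifolds.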