Modified Armijo line search in optimization on Riemannian submanifolds with reduced computational cost
Abstract: For optimization problems on Riemannian manifolds, many globally convergent algorithms have been proposed, and they are often equipped with a Riemannian version of the Armijo line search to guarantee global convergence. Such existing methods must evaluate a retraction mapping along the search direction several times at each iteration; this may result in high computational cost, particularly when evaluating the retraction is expensive. To address this issue, this study focuses on Riemannian submanifolds of Euclidean spaces and proposes a novel Riemannian line search that achieves a lower computational cost by computing the retraction only when inevitable. A class of Riemannian optimization algorithms with the new line search strategy, including the steepest descent and Newton methods, is proposed and proved to be globally convergent. Furthermore, numerical experiments on optimization problems over several types of Riemannian submanifolds illustrate that the proposed methods outperform the standard methods based on the Riemannian Armijo line search.
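To make the cost issue concrete, the following is a minimal sketch of the *standard* Riemannian Armijo backtracking line search that the abstract contrasts against, using steepest descent on the unit sphere (a Riemannian submanifold of Euclidean space) with the usual normalization retraction R_x(v) = (x + v)/||x + v||. Note how every backtracking trial inside `armijo_step` requires a fresh retraction evaluation; this repeated evaluation is exactly what the paper's proposed strategy avoids. The objective (a Rayleigh quotient with a hypothetical diagonal matrix) and all function names here are illustrative choices, not the paper's code.

```python
import numpy as np

def sphere_retract(x, v):
    """Retraction on the unit sphere: R_x(v) = (x + v) / ||x + v||."""
    y = x + v
    return y / np.linalg.norm(y)

def riemannian_grad(x, egrad):
    """Project the Euclidean gradient onto the tangent space at x."""
    return egrad - np.dot(x, egrad) * x

def armijo_step(f, x, grad, direction, beta=0.5, sigma=1e-4, t0=1.0, max_backtracks=50):
    """Standard Riemannian Armijo backtracking.

    Each trial step size requires one retraction evaluation -- the cost
    the abstract identifies as potentially expensive.
    """
    t = t0
    fx = f(x)
    slope = np.dot(grad, direction)  # <grad f(x), d>, negative for descent directions
    for _ in range(max_backtracks):
        if f(sphere_retract(x, t * direction)) <= fx + sigma * t * slope:
            return t
        t *= beta
    return t

# Illustrative problem: minimize the Rayleigh quotient f(x) = x^T A x on the sphere.
A = np.diag([1.0, 1.1, 1.2, 1.3, 1.4])  # hypothetical test matrix
f = lambda x: x @ A @ x
x = np.ones(5) / np.sqrt(5.0)
for _ in range(200):
    g = riemannian_grad(x, 2 * A @ x)   # Riemannian steepest-descent direction is -g
    t = armijo_step(f, x, g, -g)
    x = sphere_retract(x, -t * g)
print(f(x))  # approaches the smallest eigenvalue of A
```

In this sketch the iterate converges to the eigenvector of the smallest eigenvalue, so `f(x)` approaches 1.0; the point of interest is only that each iteration may trigger several retraction calls inside the backtracking loop.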