
Accelerated optimization algorithms and ordinary differential equations: the convex non-Euclidean case

Published 25 Oct 2024 in math.OC (arXiv:2410.19380v1)

Abstract: We study the connections between ordinary differential equations and optimization algorithms in a non-Euclidean setting. We propose a novel accelerated algorithm for minimising convex functions over a convex constrained set. This algorithm is a natural generalization of Nesterov's accelerated gradient descent method to the non-Euclidean setting and can be interpreted as an additive Runge-Kutta algorithm. The algorithm can also be derived as a numerical discretization of the ODE appearing in Krichene et al. (2015a). We use Lyapunov functions to establish convergence rates for the ODE and show that the discretizations considered achieve acceleration beyond the setting studied in Krichene et al. (2015a). Finally, we discuss how the proposed algorithm connects to various equations and algorithms in the literature.
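The abstract describes the proposed method as a natural generalization of Nesterov's accelerated gradient descent to the non-Euclidean setting. As context, a minimal sketch of the classical Euclidean baseline being generalized (not the paper's algorithm; the function, step size, and momentum schedule below are standard textbook choices, not taken from this paper) might look like:

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps=100):
    """Classical (Euclidean) Nesterov accelerated gradient descent
    for a convex function with L-Lipschitz gradient."""
    x = y = np.asarray(x0, dtype=float)
    for k in range(steps):
        x_next = y - grad(y) / L                   # gradient step at the extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)  # momentum / extrapolation step
        x = x_next
    return x

# Toy example: f(x) = 0.5 * x @ A @ x with A = diag(1, 10),
# so grad f(x) = A x and the smoothness constant is L = 10.
A = np.diag([1.0, 10.0])
x_star = nesterov_agd(lambda x: A @ x, [5.0, -3.0], L=10.0, steps=2000)
```

This iteration achieves the accelerated O(1/k^2) convergence rate in function values; the paper's contribution is extending this scheme beyond the Euclidean setting, interpreting it as an additive Runge-Kutta discretization of the accelerated ODE.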
