
Information Geometry of Exponentiated Gradient: Convergence beyond L-Smoothness

Published 7 Apr 2025 in math.OC (arXiv:2504.05136v1)

Abstract: We study the minimization of smooth, possibly nonconvex functions over the positive orthant, a key setting in Poisson inverse problems, using the exponentiated gradient (EG) method. Interpreting EG as Riemannian gradient descent (RGD) with the $e$-Exp map from information geometry as a retraction, we prove global convergence under weak assumptions -- without the need for $L$-smoothness -- and finite termination of the Riemannian Armijo line search. Numerical experiments, including an accelerated variant of EG, highlight the method's practical advantages, such as faster convergence compared to RGD based on interior-point geometry.
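For orientation, the core EG update over the positive orthant is the elementwise multiplicative step $x_{k+1} = x_k \odot \exp(-t_k \nabla f(x_k))$, which keeps every iterate strictly positive by construction. The sketch below illustrates this update combined with a backtracking Armijo line search of the kind the abstract mentions. It is a minimal illustration under assumptions made here, not the paper's implementation: the function names, parameter defaults, and the stopping test are all illustrative.

```python
import numpy as np

def exponentiated_gradient(f, grad_f, x0, t0=1.0, beta=0.5, sigma=1e-4,
                           tol=1e-8, max_iter=500):
    """Exponentiated gradient over the positive orthant with a
    backtracking Armijo line search (illustrative sketch; parameter
    names and defaults are assumptions, not the paper's)."""
    x = np.asarray(x0, dtype=float)
    assert np.all(x > 0), "start in the interior of the positive orthant"
    for _ in range(max_iter):
        g = grad_f(x)
        # x * g serves here as a stationarity proxy for the stopping rule
        # (a choice made for this sketch).
        if np.linalg.norm(x * g) < tol:
            break
        fx, t = f(x), t0
        # The directional derivative of f along t -> x * exp(-t * g)
        # at t = 0 is -sum_i x_i g_i^2, which gives the decrease target.
        slope = np.dot(g, x * g)
        while True:
            x_new = x * np.exp(-t * g)   # elementwise multiplicative step
            if f(x_new) <= fx - sigma * t * slope or t < 1e-16:
                break
            t *= beta                    # shrink step until Armijo holds
        x = x_new
    return x

# Toy example: a Poisson-likelihood-style objective minimized at x = y.
if __name__ == "__main__":
    y = np.array([2.0, 0.5, 1.5])
    f = lambda x: np.sum(x - y * np.log(x))
    grad_f = lambda x: 1.0 - y / x
    print(exponentiated_gradient(f, grad_f, x0=np.ones_like(y)))
```

Note that the backtracking condition compares function values directly and never invokes a Lipschitz constant of the gradient, which is consistent with the abstract's emphasis on convergence without $L$-smoothness.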
