Convex optimization over a probability simplex

Published 15 May 2023 in math.OC, cs.LG, cs.NA, math.NA, q-fin.PM, and stat.ML | (2305.09046v2)

Abstract: We propose a new iteration scheme, the Cauchy-Simplex, to optimize convex problems over the probability simplex $\{w\in\mathbb{R}^n \mid \sum_i w_i=1 \ \text{and}\ w_i\geq 0\}$. Specifically, we map the simplex to the positive quadrant of a unit sphere, envisage gradient descent in latent variables, and map the result back in a way that depends only on the simplex variable. Moreover, proving rigorous convergence results in this formulation leads naturally to tools from information theory (e.g., cross-entropy and KL divergence). Each iteration of the Cauchy-Simplex consists of simple operations, making it well-suited for high-dimensional problems. In continuous time, we prove that $f(w_T)-f(w^*) = O(1/T)$ for differentiable real-valued convex functions, where $T$ is the number of time steps and $w^*$ is the optimal solution. Numerical experiments on projection onto convex hulls show faster convergence than similar algorithms. Finally, we apply our algorithm to online learning problems and prove convergence of the average regret for (1) prediction with expert advice and (2) universal portfolios.
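The sphere-mapping idea in the abstract can be illustrated with a short sketch. This is not the authors' exact Cauchy-Simplex update (the paper should be consulted for that); it only demonstrates the latent-variable construction the abstract describes, assuming the substitution $u_i = \sqrt{w_i}$, which sends the simplex ($\sum_i w_i = 1$, $w_i \ge 0$) to the positive quadrant of the unit sphere ($\|u\|_2 = 1$, $u_i \ge 0$). The step size `eta` and the quadratic test objective are illustrative choices, not taken from the paper.

```python
import numpy as np

def sphere_step(w, grad, eta=0.01):
    """One illustrative latent-variable step: map the simplex point w
    to u = sqrt(w) on the positive quadrant of the unit sphere, take a
    gradient step in u, retract onto the sphere, and map back via
    w = u**2 (so sum(w) = 1 and w >= 0 hold by construction)."""
    u = np.sqrt(w)                 # ||u||_2 = 1 because sum(w) = 1
    grad_u = 2.0 * u * grad        # chain rule for f(u**2) w.r.t. u
    u = u - eta * grad_u
    u = np.clip(u, 0.0, None)      # stay in the positive quadrant
    u /= np.linalg.norm(u)         # retract onto the unit sphere
    return u**2

# Toy problem: minimize f(w) = ||A w - b||^2 over the simplex.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
w = np.full(5, 0.2)                # uniform starting point
for _ in range(500):
    g = 2.0 * A.T @ (A @ w - b)    # gradient of the quadratic
    w = sphere_step(w, g)

print(w.sum(), w.min())            # simplex constraints hold throughout
```

Note that every iterate satisfies both simplex constraints exactly: nonnegativity comes from squaring, and the unit sum comes from the sphere retraction, so no explicit simplex projection is needed.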

Citations (2)


Authors (2)
