
Sparse Regularized Optimal Transport without Curse of Dimensionality

Published 7 May 2025 in math.ST, math.PR, and stat.TH (arXiv:2505.04721v1)

Abstract: Entropic optimal transport -- the optimal transport problem regularized by KL divergence -- is highly successful in statistical applications. Thanks to the smoothness of the entropic coupling, its sample complexity avoids the curse of dimensionality suffered by unregularized optimal transport. The flip side of smoothness is overspreading: the entropic coupling always has full support, whereas the unregularized coupling that it approximates is usually sparse, even given by a map. Regularizing optimal transport by less-smooth $f$-divergences such as Tsallis divergence (i.e., $L^p$-regularization) is known to allow for sparse approximations, but is often thought to suffer from the curse of dimensionality, as the couplings have limited differentiability and the dual is not strongly concave. We refute this conventional wisdom and show, for a broad family of divergences, that the key empirical quantities converge at the parametric rate, independently of the dimension. More precisely, we provide central limit theorems for the optimal cost, the optimal coupling, and the dual potentials induced by i.i.d. samples from the marginals. These results are obtained by a powerful yet elementary approach that is of broader interest for Z-estimation in function classes that are not Donsker.
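The sparsity contrast the abstract describes can be seen numerically on a toy problem. The sketch below is illustrative only (it is not the paper's method, and all parameter choices are assumptions): it solves a small discrete OT problem once with KL regularization via Sinkhorn iterations, producing a coupling with full support, and once with quadratic regularization (the Tsallis case $p=2$, i.e. $L^2$) via a generic gradient ascent on the dual, producing a coupling with exact zeros.

```python
import numpy as np

# Toy comparison: entropic (KL) vs. quadratic (Tsallis-2 / L^2) regularization.
# All parameter choices (n, eps, step size, iteration counts) are illustrative.
rng = np.random.default_rng(0)
n = 6
mu = np.full(n, 1.0 / n)              # uniform source marginal
nu = np.full(n, 1.0 / n)              # uniform target marginal
x, y = rng.random(n), rng.random(n)
C = (x[:, None] - y[None, :]) ** 2    # squared-distance cost matrix
eps = 0.05                            # regularization strength

# Entropic OT via Sinkhorn iterations: coupling = diag(u) K diag(v),
# which is strictly positive entrywise (full support).
K = np.exp(-C / eps)
u = np.ones(n)
for _ in range(2000):
    v = nu / (K.T @ u)
    u = mu / (K @ v)
pi_ent = u[:, None] * K * v[None, :]

# Quadratically regularized OT via gradient ascent on the dual potentials
# (f, g); at optimality pi_ij = max(0, f_i + g_j - C_ij) / eps, so entries
# with f_i + g_j < C_ij are exactly zero and the coupling is sparse.
f = np.zeros(n)
g = np.zeros(n)
step = 0.5 * eps / n                  # conservative step for stability
for _ in range(50000):
    P = np.maximum(0.0, f[:, None] + g[None, :] - C) / eps
    f += step * (mu - P.sum(axis=1))
    g += step * (nu - P.sum(axis=0))
pi_quad = np.maximum(0.0, f[:, None] + g[None, :] - C) / eps

print("entropic coupling: fraction of nonzero entries =", np.mean(pi_ent > 0))
print("quadratic coupling: fraction of nonzero entries =", np.mean(pi_quad > 0))
```

On this example the entropic coupling has every entry strictly positive, while the quadratic one places mass on only a few cells, consistent with the "overspreading" phenomenon the abstract attributes to KL regularization.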
