
Concentration Bounds for Discrete Distribution Estimation in KL Divergence

Published 14 Feb 2023 in stat.ML, cs.DM, cs.IT, cs.LG, math.IT, and math.PR | (2302.06869v2)

Abstract: We study the problem of discrete distribution estimation in KL divergence and provide concentration bounds for the Laplace estimator. We show that the deviation from mean scales as $\sqrt{k}/n$ when $n \ge k$, improving upon the best prior result of $k/n$. We also establish a matching lower bound that shows that our bounds are tight up to polylogarithmic factors.
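To make the setting concrete, here is a minimal sketch of the Laplace (add-one) estimator and the KL loss it is evaluated under. The function names and the uniform test distribution are illustrative choices, not taken from the paper; the estimator itself is the standard $\hat{p}_i = (N_i + 1)/(n + k)$ over a support of size $k$.

```python
import math
import random
from collections import Counter

def laplace_estimator(samples, k):
    """Add-one (Laplace) smoothed estimate over the support {0, ..., k-1}.

    With n samples and counts N_i, returns p_hat_i = (N_i + 1) / (n + k),
    which assigns positive mass to every symbol.
    """
    n = len(samples)
    counts = Counter(samples)
    return [(counts.get(i, 0) + 1) / (n + k) for i in range(k)]

def kl_divergence(p, q):
    """KL(p || q) = sum_i p_i log(p_i / q_i).

    Finite whenever q has full support, which Laplace smoothing guarantees.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative example (not from the paper): estimate a uniform
# distribution on k = 5 symbols from n = 1000 i.i.d. draws, in the
# regime n >= k studied in the abstract.
random.seed(0)
k, n = 5, 1000
p = [1 / k] * k
samples = [random.randrange(k) for _ in range(n)]
q = laplace_estimator(samples, k)
loss = kl_divergence(p, q)
```

The smoothing is what makes the KL loss well defined: an unsmoothed empirical estimate can assign zero probability to an unseen symbol, making $D(p \| \hat{p})$ infinite.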

Citations (3)
