Concentration Bounds for Discrete Distribution Estimation in KL Divergence
Published 14 Feb 2023 in stat.ML, cs.DM, cs.IT, cs.LG, math.IT, and math.PR | (2302.06869v2)
Abstract: We study the problem of discrete distribution estimation in KL divergence and provide concentration bounds for the Laplace estimator. We show that the deviation from the mean scales as $\sqrt{k}/n$ when $n \ge k$, improving upon the best prior result of $k/n$. We also establish a matching lower bound, showing that our bounds are tight up to polylogarithmic factors.
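As a concrete illustration of the setting studied in the abstract, the sketch below draws $n$ samples from a discrete distribution over $k$ symbols, forms the Laplace (add-one) estimator $\hat p_i = (N_i + 1)/(n + k)$, and computes the KL divergence from the true distribution to the estimate. This is only a minimal worked example of the estimator under study, not code from the paper; the uniform true distribution and sample sizes are assumptions chosen for the demo.

```python
import numpy as np

def laplace_estimator(counts: np.ndarray, k: int) -> np.ndarray:
    """Add-one (Laplace) smoothed estimate of a discrete distribution.

    counts[i] is the number of times symbol i was observed; k is the
    alphabet size. Returns (N_i + 1) / (n + k) for each symbol.
    """
    n = counts.sum()
    return (counts + 1) / (n + k)

def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """KL(p || q) in nats; terms with p_i = 0 contribute 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

rng = np.random.default_rng(0)
k, n = 10, 1000                      # alphabet size and sample count (n >= k)
p = np.full(k, 1.0 / k)              # assumed true distribution: uniform
samples = rng.integers(0, k, size=n)
counts = np.bincount(samples, minlength=k)
p_hat = laplace_estimator(counts, k)
print(kl_divergence(p, p_hat))       # small, since p_hat is never zero
```

Because the Laplace estimator never assigns zero probability to any symbol, $\mathrm{KL}(p \,\|\, \hat p)$ is always finite, which is what makes concentration bounds for this estimator possible in the first place.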