
Estimating the variance of Shannon entropy

Published 29 Apr 2021 in cs.IT and math.IT (arXiv:2105.12829v2)

Abstract: The statistical analysis of data stemming from dynamical systems, including, but not limited to, time series, routinely relies on the estimation of information-theoretical quantities, most notably Shannon entropy. For this purpose, possibly the most widespread tool is the so-called plug-in estimator, whose statistical properties in terms of bias and variance have been investigated since the first decade after the publication of Shannon's seminal works. In the case of an underlying multinomial distribution, while the bias can be evaluated given the support and the data set size, the variance is far more elusive. The aim of the present work is to investigate, in the multinomial case, the statistical properties of an estimator of a parameter that describes the variance of the plug-in estimator of Shannon entropy. We then exactly determine the probability distributions that maximize that parameter. The results presented here allow one to set upper limits on the uncertainty of entropy assessments under the hypothesis of memoryless underlying stochastic processes.
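As a concrete illustration (not taken from the paper itself), the following is a minimal sketch of the plug-in entropy estimator discussed in the abstract, together with the classical first-order (delta-method) approximation of its variance for a multinomial sample; function names are our own choice:

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimator of Shannon entropy, in nats.

    Replaces the unknown probabilities p_i with observed relative
    frequencies and evaluates H = -sum_i p_i * ln(p_i).
    """
    n = len(samples)
    probs = [c / n for c in Counter(samples).values()]
    return -sum(p * math.log(p) for p in probs)

def plugin_entropy_variance(samples):
    """First-order approximation of the plug-in estimator's variance:

        Var(H_hat) ~ ( sum_i p_i * ln(p_i)^2 - H^2 ) / n,

    with plug-in frequencies substituted for the unknown p_i. This is the
    classical asymptotic result for the multinomial case; the paper studies
    the statistical properties of estimators of this variance parameter.
    """
    n = len(samples)
    probs = [c / n for c in Counter(samples).values()]
    h = -sum(p * math.log(p) for p in probs)
    second_moment = sum(p * math.log(p) ** 2 for p in probs)
    return (second_moment - h ** 2) / n
```

Note that for a uniform distribution the variance parameter above vanishes (all `ln(p_i)` are equal), which is why, as the abstract indicates, identifying the distributions that *maximize* it is the nontrivial question.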

Citations (10)
