Shannon's entropy revisited
Published 18 Mar 2015 in cs.IT and math.IT (arXiv:1504.01407v1)
Abstract: I consider the effect of a finite sample size on the entropy of a sample of independent events. I propose a formula for entropy which satisfies Shannon's axioms, and which reduces to Shannon's entropy when the sample size is infinite. I discuss the physical meaning of the difference between the two formulas, including some practical implications, such as the maximum achievable channel utilization and the minimum achievable communication protocol overhead for a given message size.
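The contrast the abstract describes can be sketched numerically. The snippet below is a minimal illustration, assuming the finite-sample entropy is the normalized log-multinomial coefficient, (1/N) log(N!/∏ nᵢ!) — a common finite-N entropy that reduces to Shannon's entropy as N → ∞ by Stirling's approximation; whether this matches the paper's exact formula is an assumption, and the function names are hypothetical.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy H = -sum p_i * log(p_i) of the empirical frequencies."""
    n = sum(counts)
    return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

def finite_sample_entropy(counts):
    """Normalized log-multinomial coefficient: (1/N) * log(N! / prod(n_i!)).

    Assumed stand-in for the paper's finite-sample formula; converges to
    shannon_entropy(counts) as the sample size N grows (Stirling).
    """
    n = sum(counts)
    log_multinomial = math.lgamma(n + 1) - sum(math.lgamma(c + 1) for c in counts)
    return log_multinomial / n

# The finite-sample value lies below the Shannon value, and the gap
# shrinks as the sample size increases at fixed frequencies.
small = [10, 10]        # N = 20
large = [1000, 1000]    # N = 2000, same frequencies
print(shannon_entropy(small))          # log 2 ≈ 0.6931
print(finite_sample_entropy(small))    # strictly smaller
print(finite_sample_entropy(large))    # closer to log 2
```

The shrinking gap is the finite-size effect the abstract refers to: for short messages the attainable entropy per symbol is below the Shannon limit, which caps channel utilization and floors protocol overhead.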