A remark on conditional entropy
Published 26 Mar 2024 in cs.IT and math.IT (arXiv:2404.02167v1)
Abstract: The following note proves that the conditional entropy of a sequence is almost time-reversal invariant; specifically, the forward and time-reversed conditional entropies differ only by a small constant factor that depends only on the forward and backward models with respect to which the entropies are computed. This gives rise to a numerical value that quantifies learnability, as well as a methodology for controlling distributional shift between datasets. Rough guidelines are given for practitioners.
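The claim can be motivated by a standard information-theoretic identity. The sketch below uses notation of my own choosing (a true sequence distribution $p$, a forward model $q_f$, and a backward model $q_b$); it is not taken from the paper and is only meant to illustrate why the forward and reversed conditional entropies can differ only through the models used to compute them.

% Sketch under assumed notation: p is the true sequence distribution,
% q_f a forward (left-to-right) model, q_b a backward (right-to-left) model.
% By the chain rule, the joint entropy factorizes identically in either time direction:
\[
  H(X_{1:n}) \;=\; \sum_{t=1}^{n} H\!\left(X_t \mid X_{1:t-1}\right)
             \;=\; \sum_{t=1}^{n} H\!\left(X_t \mid X_{t+1:n}\right).
\]
% Evaluating sequences under models rather than the true distribution adds a
% model-dependent divergence term to each direction:
\[
  \mathbb{E}_p\!\left[-\log q_f(X_{1:n})\right] = H(X_{1:n}) + D_{\mathrm{KL}}\!\left(p \,\|\, q_f\right),
  \qquad
  \mathbb{E}_p\!\left[-\log q_b(X_{1:n})\right] = H(X_{1:n}) + D_{\mathrm{KL}}\!\left(p \,\|\, q_b\right),
\]
% so the forward and backward cross-entropies differ only by
% D_KL(p || q_f) - D_KL(p || q_b), i.e. through the two models.

On this reading, comparing the forward and backward per-token cross-entropies isolates a model-dependent gap, which is the kind of quantity the abstract suggests can be used to gauge learnability or detect distributional shift; the precise constant and guidelines are those of the paper, not reproduced here.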