On information gain, Kullback-Leibler divergence, entropy production and the involution kernel

Published 4 Mar 2020 in math.DS, cond-mat.stat-mech, math-ph, math.MP, and math.PR (arXiv:2003.02030v2)

Abstract: It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, which extends the concept of Shannon entropy, plays a fundamental role. Given an {\it a priori} probability kernel $\hat{\nu}$ and a probability $\pi$ on the measurable space $X\times Y$, we consider a suitable definition of the entropy of $\pi$ relative to $\hat{\nu}$, building on previous work. Using this concept of entropy, we obtain a natural definition of information gain for general measurable spaces, which coincides with the mutual information derived from the K-L divergence in the case where $\hat{\nu}$ is identified with a probability $\nu$ on $X$. This is then used to extend the notions of specific information gain and dynamical entropy production to the model of thermodynamic formalism for symbolic dynamics over a compact alphabet (the TFCA model). In this setting, we show that the involution kernel is a natural tool for a better understanding of some important properties of entropy production.
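
For orientation, here is a minimal sketch of the standard definitions the abstract alludes to; this is textbook background, not the paper's kernel-based construction, and the marginal notation $\pi_X$, $\pi_Y$ is introduced here for illustration. For probabilities $\pi \ll \mu$ on the same measurable space, the Kullback-Leibler divergence is
$$D_{\mathrm{KL}}(\pi \,\|\, \mu) \;=\; \int \log\frac{d\pi}{d\mu}\, d\pi,$$
and for a probability $\pi$ on $X \times Y$ with marginals $\pi_X$ and $\pi_Y$, the mutual information is the divergence from the product of the marginals,
$$I(\pi) \;=\; D_{\mathrm{KL}}\bigl(\pi \,\|\, \pi_X \otimes \pi_Y\bigr).$$
As the abstract states, the paper's entropy of $\pi$ relative to the kernel $\hat{\nu}$ recovers this mutual information in the special case where $\hat{\nu}$ is identified with a fixed probability $\nu$ on $X$.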
