Minimizing and Maximizing the Shannon Entropy for Fixed Marginals
Published 5 Sep 2025 in math.OC (arXiv:2509.05099v1)
Abstract: The mutual information (MI) between two random variables is an important correlation measure in data analysis. When the marginals are fixed, the Shannon entropy of the joint probability distribution is the only variable part of the MI. We minimize and maximize this joint entropy to obtain the largest and smallest MI attainable under these constraints, which yields a scaled MI ratio with better comparability. We present algorithmic approaches and optimal solutions for a set of problem instances based on data from molecular evolution, and we show that this allows us to construct a sensible, systematic correction to raw MI values.
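The connection the abstract relies on can be sketched numerically: since MI(X;Y) = H(X) + H(Y) - H(X,Y), fixing the marginals makes the joint entropy H(X,Y) the only variable term, so maximizing H(X,Y) (achieved by the independent product coupling) gives the smallest MI, namely zero, while minimizing it gives the largest MI. The sketch below is not the paper's algorithm; it uses a simple greedy heuristic for the minimum-entropy coupling (which only upper-bounds the true minimum entropy, hence lower-bounds the maximal MI). All function names are illustrative.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector (zeros ignored)."""
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(J):
    """MI of a joint distribution J via MI = H(X) + H(Y) - H(X,Y)."""
    px, py = J.sum(axis=1), J.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(J)

def greedy_min_entropy_coupling(px, py):
    """Heuristic low-entropy coupling with marginals px, py:
    repeatedly pair the largest remaining marginal masses.
    Returns a joint distribution; its entropy upper-bounds the true minimum."""
    px, py = np.array(px, dtype=float), np.array(py, dtype=float)
    J = np.zeros((len(px), len(py)))
    while px.max() > 1e-12 and py.max() > 1e-12:
        i, j = int(np.argmax(px)), int(np.argmax(py))
        m = min(px[i], py[j])
        J[i, j] += m
        px[i] -= m
        py[j] -= m
    return J

# Example with uniform binary marginals.
px = np.array([0.5, 0.5])
py = np.array([0.5, 0.5])

J_ind = np.outer(px, py)              # max-entropy coupling -> MI = 0
J_low = greedy_min_entropy_coupling(px, py)

mi_min = mutual_information(J_ind)    # 0 by construction
mi_max = mutual_information(J_low)    # lower bound on the true maximal MI
```

A raw MI value for some observed joint distribution with these marginals could then be scaled as `mi / mi_max` (with `mi_min = 0`), giving a ratio in [0, 1] that is comparable across marginal distributions; the paper's exact correction may differ.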