
Minimizing and Maximizing the Shannon Entropy for Fixed Marginals

Published 5 Sep 2025 in math.OC (arXiv:2509.05099v1)

Abstract: The mutual information (MI) between two random variables is an important correlation measure in data analysis. Since MI decomposes as H(X) + H(Y) − H(X, Y), the joint Shannon entropy H(X, Y) is the only variable part once the marginals are fixed. We minimize and maximize it to obtain the largest and smallest MI attainable under the given marginals, leading to a scaled MI ratio for better comparability. We present algorithmic approaches and optimal solutions for a set of problem instances based on data from molecular evolution, and we show that this allows us to construct a sensible, systematic correction to raw MI values.
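The setup in the abstract can be illustrated numerically. Because MI = H(X) + H(Y) − H(X, Y) and the marginal entropies are fixed, maximizing the joint entropy (achieved by the independent product coupling) gives the smallest MI, namely 0, while minimizing the joint entropy gives the largest MI. The sketch below is not the paper's algorithm; it uses a simple greedy low-entropy-coupling heuristic (repeatedly pair the largest remaining marginal masses) as a stand-in for the minimization step, and all function names are illustrative.

```python
import heapq
import math

def entropy(ps):
    """Shannon entropy in bits of a probability vector, ignoring zeros."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

def mutual_information(joint):
    """MI = H(row marginal) + H(column marginal) - H(joint)."""
    rows = [sum(r) for r in joint]
    cols = [sum(c) for c in zip(*joint)]
    flat = [p for r in joint for p in r]
    return entropy(rows) + entropy(cols) - entropy(flat)

def greedy_low_entropy_coupling(p, q):
    """Heuristic only: repeatedly transfer min(p_i, q_j) mass between the
    largest remaining marginal entries. Produces a coupling with low (not
    provably minimal) joint entropy, hence a large MI for these marginals."""
    joint = [[0.0] * len(q) for _ in p]
    hp = [(-v, i) for i, v in enumerate(p)]   # max-heaps via negation
    hq = [(-v, j) for j, v in enumerate(q)]
    heapq.heapify(hp)
    heapq.heapify(hq)
    while hp and hq:
        vp, i = heapq.heappop(hp)
        vq, j = heapq.heappop(hq)
        m = min(-vp, -vq)
        if m <= 1e-12:
            break
        joint[i][j] += m
        if -vp - m > 1e-12:
            heapq.heappush(hp, (vp + m, i))
        if -vq - m > 1e-12:
            heapq.heappush(hq, (vq + m, j))
    return joint

# Product coupling of the marginals realizes the maximum joint entropy,
# so its MI is 0 -- the lower end of the attainable range.
p, q = [0.5, 0.3, 0.2], [0.6, 0.4]
product = [[pi * qj for qj in q] for pi in p]
mi_min = mutual_information(product)          # ~0 by construction
mi_max = mutual_information(greedy_low_entropy_coupling(p, q))
```

A scaled MI ratio in the spirit of the abstract would then divide an observed raw MI (for the same marginals) by `mi_max`, mapping it into [0, 1] for comparability across marginal distributions.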
