Enumerable Distributions, Randomness, Dependence
Abstract: Mutual information $I$ in infinite sequences (and in their finite prefixes) is essential in the theoretical analysis of many situations. Yet its right definition has been elusive for a long time. I address it by generalizing Kolmogorov complexity theory from measures to SEMImeasures, i.e., infimums of sets of measures. Being concave rather than linear functionals, semimeasures are quite delicate to handle. Yet they adequately capture various theoretical and practical scenarios. A simple lower bound $i(\alpha:\beta) = \sup_{x\in\mathbb{N}}\,(K(x) - K(x|\alpha) - K(x|\beta))$ for information turns out to be tight for Martin-Löf random $\alpha,\beta$. For all sequences, $I(\alpha:\beta)$ is characterized as the minimum of $i(\alpha':\beta')$ over random $\alpha',\beta'$ with $U(\alpha')=\alpha$, $U(\beta')=\beta$.
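For readability, the abstract's two central quantities can be set out in display form. This restatement is mine, not the paper's: it only assumes the abstract's own notation, reading $K$ as (prefix) Kolmogorov complexity and $U$ as the universal algorithm mapping sequences to sequences. The lower bound is

$$
i(\alpha:\beta) \;=\; \sup_{x\in\mathbb{N}}\bigl(K(x) - K(x\,|\,\alpha) - K(x\,|\,\beta)\bigr),
$$

which, per the abstract, coincides with the true mutual information when $\alpha,\beta$ are Martin-Löf random. For arbitrary sequences, the characterization reads

$$
I(\alpha:\beta) \;=\; \min\bigl\{\, i(\alpha':\beta') \;:\; \alpha',\beta' \text{ random},\; U(\alpha')=\alpha,\; U(\beta')=\beta \,\bigr\},
$$

i.e., the dependence between two arbitrary sequences is measured through the least-dependent pair of random sequences from which they can be algorithmically generated.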