A Local Characterization of $f$-Divergences Yielding PSD Mutual-Information Matrices
Abstract: We study when the variable-indexed matrix of pairwise $f$-mutual informations $M^{(f)}_{ij}=I_f(X_i;X_j)$ is positive semidefinite (PSD). Let $f:(0,\infty)\to\mathbb{R}$ be convex with $f(1)=0$, finite in a neighborhood of $1$, and with $f(0)<\infty$ so that the diagonal terms are finite. We give a sharp \emph{local} characterization around independence: there exists $\delta=\delta(f)>0$ such that for every $n$ and every finite-alphabet family $(X_1,\ldots,X_n)$ whose pairwise joint-to-product ratios lie in $(1-\delta,1+\delta)$, the matrix $M^{(f)}$ is PSD if and only if $f$ is analytic at $1$ with a convergent expansion $f(t)=\sum_{m=2}^{\infty} a_m (t-1)^m$, $a_m\ge 0$, on a neighborhood of $1$. Consequently, any negative Taylor coefficient yields an explicit finite-alphabet counterexample under arbitrarily weak dependence, and non-analytic convex divergences (e.g.\ total variation) are excluded. This PSD requirement is distinct from Hilbertian/metric properties of divergences between distributions (e.g.\ $\sqrt{\mathrm{JS}}$): we study PSD of the \emph{variable-indexed} mutual-information matrix. The proof combines a replica embedding that turns monomial terms into Gram matrices with a replica-forcing reduction to positive-definite dot-product kernels, enabling an application of the Schoenberg--Berg--Christensen--Ressel classification.
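The objects in the abstract can be illustrated numerically. The sketch below (not from the paper; all names and the noisy-bit construction are illustrative choices) builds the variable-indexed matrix $M^{(f)}$ for the $\chi^2$ generator $f(t)=(t-1)^2$, which is analytic at $1$ with nonnegative Taylor coefficients ($a_2=1$, all others zero) and $f(0)=1<\infty$, on a weakly dependent binary family, and checks positive semidefiniteness via eigenvalues.

```python
import numpy as np

def f_chi2(t):
    # chi^2 generator: f(t) = (t-1)^2.  Convex, f(1) = 0, analytic at 1
    # with nonnegative Taylor coefficients, and f(0) = 1 < infinity,
    # so the local PSD criterion stated in the abstract is satisfied.
    return (t - 1.0) ** 2

def f_mi(pxy, f):
    # I_f(X;Y) = sum_{x,y} p(x) p(y) f( p(x,y) / (p(x) p(y)) )
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    prod = np.outer(px, py)
    return float((prod * f(pxy / prod)).sum())

def pairwise_joint(eps_i, eps_j):
    # X_i, X_j are conditionally independent noisy copies of a fair
    # hidden bit Z, flipped with probabilities eps_i, eps_j.  Flip
    # probabilities near 1/2 keep the joint-to-product ratios close
    # to 1, i.e. the weak-dependence regime of the theorem.
    pxy = np.zeros((2, 2))
    for z in (0, 1):
        for x in (0, 1):
            for y in (0, 1):
                px_z = (1 - eps_i) if x == z else eps_i
                py_z = (1 - eps_j) if y == z else eps_j
                pxy[x, y] += 0.5 * px_z * py_z
    return pxy

eps = [0.45, 0.46, 0.47, 0.48]   # weak dependence on the hidden bit
n = len(eps)
M = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i == j:
            # joint of X_i with itself: mass 1/2 on each diagonal cell
            M[i, j] = f_mi(np.diag([0.5, 0.5]), f_chi2)
        else:
            M[i, j] = f_mi(pairwise_joint(eps[i], eps[j]), f_chi2)

min_eig = np.linalg.eigvalsh(M).min()
print(min_eig >= -1e-10)   # PSD up to numerical tolerance
```

By contrast, the KL generator $f(t)=t\log t$ expands as $(1+u)\log(1+u)=u+\tfrac{u^2}{2}-\tfrac{u^3}{6}+\cdots$ with $a_3=-\tfrac{1}{6}<0$, so by the stated result the Shannon mutual-information matrix admits finite-alphabet counterexamples to PSD even under arbitrarily weak dependence.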