Entropic Conjugation Framework
- Entropic conjugation is a duality operator that links joint entropies, unifying metrics such as total correlation, dual total correlation, and others.
- It decomposes high-order dependencies into symmetric (total interdependence) and skew-symmetric (redundancy vs. synergy) components using an averaged conditional mutual information basis.
- Applications in spin systems, neuroscience, and statistical mechanics illustrate its potential for uncovering new metric directions, especially in larger systems (n > 5).
The entropic conjugation framework is a foundational mathematical structure for analyzing high-order interdependencies in collections of random variables, unifying a broad family of information-theoretic metrics and elucidating their structural relations. The framework centers on a canonical duality—entropic conjugation—over functionals representing multivariate joint entropies, yielding a principled taxonomy of high-order informational quantities characterized by symmetry and skew-symmetry. The approach is applicable across diverse domains, including statistical mechanics, neuroscience, and complex systems, where distinguishing low- and high-order effects is central.
1. Formal Definition of Entropic Conjugation
Let $X = (X_1, \ldots, X_n)$ denote random variables indexed by $N = \{1, \ldots, n\}$, with joint entropies $H(X_a)$ for every subset $a \subseteq N$. The entropic conjugation operator $\sigma$ acts on single-term entropies as

$$\sigma[H(X_a)] = H(X_{a^c}) - H(X_N) = -H(X_a \mid X_{a^c}),$$

where $a^c = N \setminus a$ is the complement of $a$ in $N$. Linearity extends to any functional expressible as $F = \sum_{a \subseteq N} c_a H(X_a)$, so

$$\sigma[F] = \sum_{a \subseteq N} c_a \left( H(X_{a^c}) - H(X_N) \right).$$

For conditional mutual informations,

$$\sigma\left[I(X_i ; X_j \mid X_a)\right] = I\left(X_i ; X_j \mid X_{N \setminus (a \cup \{i,j\})}\right).$$
This establishes a concrete involutive transformation within the space of entropy-based metrics.
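These properties can be checked numerically. The sketch below (a minimal illustration, assuming the convention $\sigma[H(X_a)] = H(X_{a^c}) - H(X_N)$) represents a functional as a dictionary mapping variable subsets to coefficients, applies the conjugation linearly, and verifies that $\sigma$ is involutive and exchanges total correlation with dual total correlation:

```python
import itertools
import math
import random

def subset_entropy(p, a):
    """Shannon entropy (bits) of the marginal of p over the variable subset a."""
    marg = {}
    for state, prob in p.items():
        key = tuple(state[i] for i in sorted(a))
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

def conjugate(coeffs, n):
    """Apply H(X_a) -> H(X_{a^c}) - H(X_N) linearly to a coefficient dict."""
    full = frozenset(range(n))
    out = {}
    for a, c in coeffs.items():
        out[full - a] = out.get(full - a, 0.0) + c
        out[full] = out.get(full, 0.0) - c
    return out

def evaluate(coeffs, p):
    return sum(c * subset_entropy(p, a) for a, c in coeffs.items())

# Random joint distribution over n = 3 binary variables.
n = 3
random.seed(0)
weights = {s: random.random() for s in itertools.product([0, 1], repeat=n)}
Z = sum(weights.values())
p = {s: w / Z for s, w in weights.items()}

full = frozenset(range(n))
# TC = sum_i H(X_i) - H(X_N)
tc = {frozenset([i]): 1.0 for i in range(n)}
tc[full] = -1.0
# DTC = sum_i H(X_{-i}) - (n - 1) * H(X_N)
dtc = {full - frozenset([i]): 1.0 for i in range(n)}
dtc[full] = float(1 - n)

print(abs(evaluate(conjugate(tc, n), p) - evaluate(dtc, p)) < 1e-9)               # sigma[TC] = DTC
print(abs(evaluate(conjugate(conjugate(tc, n), n), p) - evaluate(tc, p)) < 1e-9)  # involution
```

Subset entropies are computed by brute-force marginalization, so this approach is only practical for small $n$.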
2. Basis and Duality in High-Order Measures
A central role is played by the averaged conditional mutual information basis

$$B_k = \binom{n}{2}^{-1} \binom{n-2}{k}^{-1} \sum_{\{i,j\} \subseteq N} \; \sum_{\substack{a \subseteq N \setminus \{i,j\} \\ |a| = k}} I(X_i ; X_j \mid X_a), \qquad k = 0, \ldots, n-2.$$

Entropic conjugation yields a duality: $\sigma[B_k] = B_{n-2-k}$. The proof leverages average entropy coefficients and their transformation properties under conjugation, together with the identity $\sigma[I(X_i ; X_j \mid X_a)] = I(X_i ; X_j \mid X_{N \setminus (a \cup \{i,j\})})$. This establishes a rigorous mapping between metrics targeting low-order and high-order interdependencies.
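The conditional-mutual-information identity underlying this duality can be verified by brute force. The sketch below (assuming the convention $H(X_a) \mapsto H(X_{a^c}) - H(X_N)$) checks the identity for every pair $\{i,j\}$ and every conditioning set $a$ of a random four-variable distribution:

```python
import itertools
import math
import random

def subset_entropy(p, a):
    """Shannon entropy (bits) of the marginal of p over the variable subset a."""
    marg = {}
    for state, prob in p.items():
        key = tuple(state[i] for i in sorted(a))
        marg[key] = marg.get(key, 0.0) + prob
    return -sum(q * math.log2(q) for q in marg.values() if q > 0)

def cmi(p, i, j, a):
    """I(X_i; X_j | X_a) = H(ia) + H(ja) - H(ija) - H(a)."""
    return (subset_entropy(p, a | {i}) + subset_entropy(p, a | {j})
            - subset_entropy(p, a | {i, j}) - subset_entropy(p, a))

n = 4
full = frozenset(range(n))
random.seed(1)
weights = {s: random.random() for s in itertools.product([0, 1], repeat=n)}
Z = sum(weights.values())
p = {s: w / Z for s, w in weights.items()}

def conjugate_value(p, coeffs):
    """Evaluate sigma[F], with F given as a {subset: coefficient} dict."""
    return sum(c * (subset_entropy(p, full - a) - subset_entropy(p, full))
               for a, c in coeffs.items())

ok = True
for i, j in itertools.combinations(range(n), 2):
    rest = full - {i, j}
    for k in range(len(rest) + 1):
        for a in itertools.combinations(sorted(rest), k):
            a = frozenset(a)
            # I(X_i; X_j | X_a) as an entropy combination.
            coeffs = {a | {i}: 1.0, a | {j}: 1.0, a | {i, j}: -1.0, a: -1.0}
            b = rest - a  # complement conditioning set N \ (a ∪ {i, j})
            ok = ok and abs(conjugate_value(p, coeffs) - cmi(p, i, j, b)) < 1e-9
print(ok)
```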
3. Mathematical Properties: Linearity, Involution, Symmetry Structures
The entropic conjugation operator is linear on any entropy combination. It is involutive: applying it twice recovers the identity, i.e., $\sigma[\sigma[F]] = F$ for every functional $F$.
Symmetry properties are defined for any functional $F$:
- Symmetric if $\sigma[F] = F$
- Skew-symmetric if $\sigma[F] = -F$

Expressed via the averaged conditional mutual information basis (Te Sun Han's decomposition) as $F = \sum_{k=0}^{n-2} \alpha_k B_k$, symmetry corresponds to $\alpha_k = \alpha_{n-2-k}$ and skew-symmetry to $\alpha_k = -\alpha_{n-2-k}$. These properties undergird the classification of all entropy-driven high-order metrics.
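A worked instance: since $\sigma$ is a linear involution, any functional $F$ splits into symmetric and skew-symmetric halves $F^{\pm} = \tfrac{1}{2}(F \pm \sigma[F])$. Applying this to total correlation, whose conjugate is dual total correlation, recovers half the S-information plus half the O-information:

```latex
% Decomposition of an arbitrary functional:
F^{+} = \tfrac{1}{2}\bigl(F + \sigma[F]\bigr), \qquad
F^{-} = \tfrac{1}{2}\bigl(F - \sigma[F]\bigr), \qquad
F = F^{+} + F^{-}.
% For F = TC, using sigma[TC] = DTC:
\mathrm{TC}
  = \underbrace{\tfrac{1}{2}\bigl(\mathrm{TC} + \mathrm{DTC}\bigr)}_{\text{symmetric: half the S-information}}
  + \underbrace{\tfrac{1}{2}\bigl(\mathrm{TC} - \mathrm{DTC}\bigr)}_{\text{skew-symmetric: half the O-information}}
```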
4. Unification of Multivariate Information-Theoretic Metrics
All label-symmetric, dependency-zero functionals are representable as linear combinations $F = \sum_{k=0}^{n-2} \alpha_k B_k$ of the averaged conditional mutual information basis. Notable metrics include:
- Total correlation (TC): $\mathrm{TC} = \sum_{i=1}^{n} H(X_i) - H(X_N)$, with $\sigma[\mathrm{TC}] = \mathrm{DTC}$
- Dual total correlation (DTC): $\mathrm{DTC} = H(X_N) - \sum_{i=1}^{n} H(X_i \mid X_{N \setminus \{i\}})$, with $\sigma[\mathrm{DTC}] = \mathrm{TC}$
- S-information ($\Sigma$): $\Sigma = \mathrm{TC} + \mathrm{DTC} = \sum_{i=1}^{n} I(X_i ; X_{N \setminus \{i\}})$, with $\sigma[\Sigma] = \Sigma$ (symmetric)
- TSE complexity: $C_{\mathrm{TSE}} = \sum_{k=1}^{n-1} \left[ \langle H(X_a) \rangle_{|a|=k} - \tfrac{k}{n} H(X_N) \right]$, with $\sigma[C_{\mathrm{TSE}}] = C_{\mathrm{TSE}}$ (symmetric)
- O-information ($\Omega$): $\Omega = \mathrm{TC} - \mathrm{DTC}$, with $\sigma[\Omega] = -\Omega$ (skew-symmetric)
- Interaction information (II): $\mathrm{II} = -\sum_{a \subseteq N} (-1)^{|a|} H(X_a)$, with $\sigma[\mathrm{II}] = (-1)^n \, \mathrm{II}$
TC and DTC form a conjugate pair; $\Sigma$ and TSE are symmetric measures of total interdependence; $\Omega$ is the unique linear skew-symmetric measure computable with $O(n)$ entropy terms; II alternates symmetry class with the parity of $n$ (symmetric for even $n$, skew-symmetric for odd $n$). This unifying structure facilitates systematic selection and interpretation of high-order informational metrics.
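Two canonical three-bit distributions make these classes concrete: the parity (XOR) distribution is purely synergistic, while the "giant bit" (three copies of one fair coin) is purely redundant. A minimal sketch, using the standard definitions $\mathrm{TC} = \sum_i H(X_i) - H(X_N)$ and $\mathrm{DTC} = \sum_i H(X_{N \setminus \{i\}}) - (n-1) H(X_N)$:

```python
import math

def entropy(dist):
    """Shannon entropy (bits) of a {state: probability} dict."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(p, idx):
    """Marginal distribution over the variables listed in idx."""
    out = {}
    for state, prob in p.items():
        key = tuple(state[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def metrics(p, n):
    """Return (TC, DTC, S-information, O-information) for a joint distribution."""
    H = entropy(p)
    tc = sum(entropy(marginal(p, (i,))) for i in range(n)) - H
    dtc = sum(entropy(marginal(p, tuple(j for j in range(n) if j != i)))
              for i in range(n)) - (n - 1) * H
    return tc, dtc, tc + dtc, tc - dtc

# Parity (XOR): uniform over even-parity triples -> synergy-dominated.
xor = {s: 0.25 for s in [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]}
# "Giant bit": three perfect copies of one fair coin -> redundancy-dominated.
copy = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

print(metrics(xor, 3))   # TC=1, DTC=2, Sigma=3, Omega=-1
print(metrics(copy, 3))  # TC=2, DTC=1, Sigma=3, Omega=+1
```

Both distributions have the same total interdependence ($\Sigma = 3$ bits); only the skew-symmetric $\Omega$ distinguishes them.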
5. Dimensional Analysis and Gaps in Metric Space
The vector space $\mathcal{V}_n$ of all labelling-symmetric, dependency-zero functionals has dimension $n - 1$. The entropic conjugation operator partitions $\mathcal{V}_n$ into $\lceil (n-1)/2 \rceil$ symmetric and $\lfloor (n-1)/2 \rfloor$ skew-symmetric directions. For $n \le 5$, the cited six metrics span all possible directions:
- $n = 3$: $\Sigma$, TSE (symmetric); $\Omega$, II (skew-symmetric)
- $n = 4$: symmetric $\Sigma$, TSE, II; skew $\Omega$
- $n = 5$: symmetric $\Sigma$, TSE; skew $\Omega$, II
For $n > 5$, additional directions emerge, representing an unexplored class of high-order metrics. Computational efficiency imposes further constraints: among symmetric and skew-symmetric metrics computable in $O(n)$ entropies, only $\Sigma$ and $\Omega$ satisfy this criterion.
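The dimension counts can be tabulated directly. This sketch assumes the pattern stated above ($\Sigma$ and TSE always symmetric, $\Omega$ always skew-symmetric, II alternating with the parity of $n$) and reports how many directions of each class remain unexplored at each $n$:

```python
import math

def coverage(n):
    """Return (unexplored symmetric, unexplored skew-symmetric) directions at size n."""
    sym_dim = math.ceil((n - 1) / 2)   # symmetric directions in the (n-1)-dim space
    skew_dim = (n - 1) // 2            # skew-symmetric directions
    # Known metrics per class: {Sigma, TSE} plus II when n is even (symmetric),
    # {Omega} plus II when n is odd (skew-symmetric).
    sym_known = min(sym_dim, 3 if n % 2 == 0 else 2)
    skew_known = min(skew_dim, 1 if n % 2 == 0 else 2)
    return sym_dim - sym_known, skew_dim - skew_known

for n in range(3, 9):
    print(n, coverage(n))  # first gap appears at n = 6
```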
6. Practical Implications and Illustrative Applications
The entropic conjugation framework enables any high-order measure to be decomposed into a symmetric part (overall interdependence strength) and a skew-symmetric part (balance of low- vs. high-order effects). In practical terms, symmetric measures track total interdependence, while skew-symmetric measures distinguish synergistic versus redundant regimes.
Applied to spin systems ($X_i \in \{-1, +1\}$), where the $X_i$ are distributed under Boltzmann statistics with pairwise couplings $J_{ij}$, analysis of the six metrics above via principal component analysis yields:
- The first principal component (correlated with $\Sigma$) represents total interdependence strength.
- The second principal component (aligned with $\Omega$) reflects the relative influence of low- and high-order interactions, separating ferromagnetic ($\Omega > 0$, redundancy-dominated), weak-coupling ($\Omega \approx 0$), and frustrated ($\Omega < 0$, synergy-dominated) regimes.
This decomposition provides immediate guidance: $\Sigma$ is the metric of choice for total interdependence analysis; $\Omega$ for redundancy-versus-synergy analysis; their combination delivers rich profiling for complex systems.
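A minimal exact computation on three spins illustrates the two regimes. This is a sketch: the triangle topology, unit couplings, and inverse temperature $\beta = 1$ are illustrative choices, not parameters taken from the source.

```python
import itertools
import math

def entropy(dist):
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

def marginal(p, idx):
    out = {}
    for state, prob in p.items():
        key = tuple(state[i] for i in idx)
        out[key] = out.get(key, 0.0) + prob
    return out

def o_information(p, n):
    """Omega = TC - DTC, computed by exact marginalization."""
    H = entropy(p)
    tc = sum(entropy(marginal(p, (i,))) for i in range(n)) - H
    dtc = sum(entropy(marginal(p, tuple(j for j in range(n) if j != i)))
              for i in range(n)) - (n - 1) * H
    return tc - dtc

def boltzmann(J, beta=1.0, n=3):
    """Exact Boltzmann distribution for n spins with pairwise couplings J[i][j]."""
    weights = {}
    for s in itertools.product([-1, 1], repeat=n):
        E = -sum(J[i][j] * s[i] * s[j] for i in range(n) for j in range(i + 1, n))
        weights[s] = math.exp(-beta * E)
    Z = sum(weights.values())
    return {s: w / Z for s, w in weights.items()}

ferro = [[0, 1, 1], [0, 0, 1], [0, 0, 0]]          # all couplings positive
frustrated = [[0, -1, -1], [0, 0, -1], [0, 0, 0]]  # antiferromagnetic triangle

print(o_information(boltzmann(ferro), 3))       # > 0: redundancy-dominated
print(o_information(boltzmann(frustrated), 3))  # < 0: synergy-dominated
```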
7. Impact, Limitations, and Outlook
Entropic conjugation establishes a rigorous duality for Shannon-based high-order metrics, systematically exchanges low- and high-order effects, and classifies all symmetric and skew-symmetric combinations for moderate system sizes ($n \le 5$). Existing measures are positioned within this taxonomy, revealing unexplored metric directions for $n > 5$ and informing computational feasibility. A plausible implication is that new high-order metrics, beyond those explicitly characterized, remain to be discovered for large systems. The framework provides both theoretical and practical roadmaps for the analysis of multivariate dependencies in complex physical and biological systems (Rosas et al., 2024).