
The many faces of multivariate information

Published 12 Jan 2026 in cs.IT | (2601.08030v1)

Abstract: Extracting higher-order structures from multivariate data has become an area of intensive study in complex systems science, as these multipartite interactions can reveal insights into fundamental features of complex systems like emergent phenomena. Information theory provides a natural language for exploring these interactions, as it elegantly formalizes the problem of comparing "wholes" and "parts" using joint, conditional, and marginal entropies. A large number of distinct statistics have been developed over the years, all aiming to capture different aspects of "higher-order" information sharing. Here, we show that three of them (the dual total correlation, S-information, and O-information) are special cases of a more general function, $\Delta^k$, which is parameterized by a free parameter $k$. For different values of $k$, we recover different measures: $\Delta^0$ is equal to the S-information, $\Delta^1$ is equal to the dual total correlation, and $\Delta^2$ is equal to the negative O-information. Generally, the $\Delta^k$ function is arranged into a hierarchy of increasingly high-order synergies; for a given value of $k$, if $\Delta^k > 0$, then the system is dominated by interactions with order greater than $k$, while if $\Delta^k < 0$, then the system is dominated by interactions with order lower than $k$. $\Delta^k = 0$ if the system is composed entirely of synergies of order $k$. Using the entropic conjugation framework, we also find that the conjugate of $\Delta^k$, which we term $\Gamma^k$, is arranged into a similar hierarchy of increasingly high-order redundancies. These results provide new insights into the nature of both higher-order redundant and synergistic interactions, and help unify the existing zoo of measures into a more coherent structure.

Summary

  • The paper introduces the Δ^k framework as a unifying parameterized function that subsumes S-information, dual total correlation, and O-information.
  • It demonstrates how Δ^k and its entropic conjugate Γ^k quantitatively differentiate high-order synergy and redundancy in system interactions.
  • The framework's additivity and combinatorial properties offer a robust basis for multi-scale analysis across diverse complex systems.

Unifying Multivariate Information Measures through the Δ^k Framework

Overview

The paper "The many faces of multivariate information" (2601.08030) addresses the fragmentation in the literature on multivariate extensions of mutual information for complex systems analysis. It systematically demonstrates that three widely used higher-order statistics (S-information, dual total correlation, and O-information) are instances of a single parameterized function, Δ^k, with the parameter k controlling the sensitivity to interdependency order. It further formalizes how these measures fit into hierarchies quantifying the balance between high-order redundancy and synergy, and introduces their entropic conjugates, leading to a dual perspective on redundancy structure.

Multivariate Information Measures

Traditional Shannon information measures (Shannon entropy, mutual information) capture only pairwise relationships, which are often insufficient for characterizing the higher-order dependencies fundamental to emergent phenomena in complex systems. Generalizing these to the multivariate setting has produced a variety of alternative measures, each said to isolate distinct "kinds" of dependency.

  • Total Correlation (T): Measures overall statistical dependence (informally, redundancy) among variables.
  • Dual Total Correlation (D): Captures global synergistic dependencies among variables.
  • S-information (S): Represents the sum T + D, interpreted as "total" multivariate dependency.
  • O-information (O): A signed measure T - D, operationalizing the redundancy/synergy dominance balance in systems.

Each quantifies distinct aspects of multipartite dependency structure, but without a general, unified formalism their relationships remain obscure, complicating both interpretation and application across fields.

The General Δ^k Hierarchy

The key contribution is the identification of a general function parameterized by k:

\Delta^k(X) = (N-k)\,T(X) - \sum_{i=1}^{N} T(X^{-i})

where N is the number of variables, and X^{-i} denotes X with variable i removed. The parameter k determines the interaction order probed:

  • k = 0: S-information S,
  • k = 1: Dual total correlation D,
  • k = 2: Negative O-information -O.

Thus, S-information, dual total correlation, and O-information are not fundamentally distinct, but are instead members of a one-parameter hierarchy. This establishes a principle by which higher-order measures can be systematically constructed and related.

The interpretation of Δ^k is as a "whole-minus-margin" statistic, quantifying how much multi-way dependence remains after accounting for all N leave-one-out marginals, with the scaling of the system-wide (whole) term set by k. The sign and magnitude of Δ^k reflect the system's structure: positive values indicate dominance by interactions of order greater than k, negative values indicate dominance by interactions of order lower than k, and Δ^k = 0 signals pure k-order dependency.
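For small discrete systems, Δ^k can be evaluated directly from the definition above. The following is a minimal sketch (not code from the paper); it uses the XOR triad, a standard example of pure order-3 synergy, to illustrate the hierarchy:

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, idxs):
    """Marginal distribution over the variables at positions idxs."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(dist, idxs):
    """T = sum of single-variable entropies minus joint entropy over idxs."""
    return sum(entropy(marginal(dist, [i])) for i in idxs) - entropy(marginal(dist, idxs))

def delta_k(dist, n, k):
    """Delta^k(X) = (N - k) T(X) - sum_i T(X^{-i})."""
    full = list(range(n))
    leave_one_out = sum(
        total_correlation(dist, [j for j in full if j != i]) for i in full
    )
    return (n - k) * total_correlation(dist, full) - leave_one_out

# XOR triad: X3 = X1 xor X2 with X1, X2 fair coins -- pure order-3 synergy.
xor = {(a, b, a ^ b): 0.25 for a, b in product([0, 1], repeat=2)}

for k in range(4):
    print(k, delta_k(xor, 3, k))
# Delta^0 = S = 3, Delta^1 = D = 2, Delta^2 = -O = 1, and Delta^3 = 0,
# consistent with a system built entirely from order-3 synergy.
```

Note how the sign flips exactly as the hierarchy predicts: Δ^k > 0 for k < 3 (interactions of order greater than k dominate) and Δ^3 = 0 for the pure order-3 dependency.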

Entropic Conjugation and the Γ^k Hierarchy

By leveraging the entropic conjugation operation [rosas_characterising_2025], the author constructs the conjugate hierarchy:

\Gamma^k(X) = S(X) - k\,D(X)

where Γ^0 = S, Γ^1 = T, and Γ^2 = O, yielding a complementary spectrum focused on redundancy rather than synergy. The Γ^k hierarchy mirrors Δ^k's sensitivity to interaction order, but for redundant structure. If a system is composed solely of k-order redundancies (i.e., "giant bits"), then Γ^k = 0; for dominance by higher-order redundancies, Γ^k > 0.

Both hierarchies are strictly additive over independent subsystems, guaranteeing correct extensivity properties crucial for analysis of large systems.
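The conjugate hierarchy can be checked the same way. A sketch (again illustrative, not from the paper), assuming the three-variable "giant bit" (one fair coin copied to every variable) as the canonical pure order-3 redundancy:

```python
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, idxs):
    """Marginal distribution over the variables at positions idxs."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(dist, n):
    """T = sum of single-variable entropies minus joint entropy."""
    return sum(entropy(marginal(dist, [i])) for i in range(n)) - entropy(dist)

def dual_total_correlation(dist, n):
    """D = (1 - N) H(X) + sum_i H(X^{-i}), the residual-entropy form of DTC."""
    full = list(range(n))
    return (1 - n) * entropy(dist) + sum(
        entropy(marginal(dist, [j for j in full if j != i])) for i in full
    )

def gamma_k(dist, n, k):
    """Gamma^k(X) = S(X) - k D(X), with S = T + D."""
    t = total_correlation(dist, n)
    d = dual_total_correlation(dist, n)
    return (t + d) - k * d

# "Giant bit": one fair coin copied to all three variables -- pure order-3 redundancy.
giant_bit = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}

for k in range(4):
    print(k, gamma_k(giant_bit, 3, k))
# Gamma^0 = S = 3, Gamma^1 = T = 2, Gamma^2 = O = 1, and Gamma^3 = 0
# for a system built entirely from order-3 redundancy.
```

As with Δ^k and synergy, Γ^k vanishes exactly at the order of the pure redundant structure, and is positive below it.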

Implications and Extensions

This unifying perspective carries multiple implications:

  • Conceptual unification: By embedding established measures as special cases in the Δ^k and Γ^k hierarchies, it dissolves the ad hoc justification for separate measures, and clarifies which types of dependency each one characterizes.
  • Order resolution: The Δ^k and Γ^k hierarchies allow for operational questions about the dominant order of synergy or redundancy; e.g., by seeking the maximal k such that Δ^k > 0, one heuristically estimates the intrinsic dimensionality or minimal description size of the system's synergistic structure.
  • Extension to new statistics: The formalism is combinatorial, depending only on basic properties of the chosen dependency measure (non-negativity, monotonicity under marginalization, fragility under single-element removal). Thus, it is adaptable to other contexts, e.g., graph-theoretic metrics, as discussed in [varley_scalable_2024], or topological invariants [varley_topology_2025].
  • Links to causality and computation: The O-information, and by extension the Δ^k statistics, map onto the structure of causal colliders, revealing applications in causal inference and multivariate computation [varley_synergistic_2024].

Technical Claims

  • Additivity: Both Δ^k and Γ^k are additive over independent subsystems.
  • Interaction purity: For systems exhibiting only pure k-order synergy or redundancy, Δ^k (or Γ^k) evaluates to zero, and its sign diagnoses the dominance of higher- or lower-order interdependencies.
  • Combinatorial interpretation: The parameter k directly corresponds to survival counts of k-order interactions under leave-one-out marginalization.
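The additivity claim can be verified numerically on a product of independent subsystems. A sketch under the same assumptions as before (XOR triad and giant bit as illustrative subsystems; not an experiment from the paper):

```python
from itertools import product
from math import log2

def entropy(dist):
    """Shannon entropy (bits) of a dict mapping outcomes to probabilities."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def marginal(dist, idxs):
    """Marginal distribution over the variables at positions idxs."""
    out = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idxs)
        out[key] = out.get(key, 0.0) + p
    return out

def total_correlation(dist, idxs):
    """T over the subset idxs."""
    return sum(entropy(marginal(dist, [i])) for i in idxs) - entropy(marginal(dist, idxs))

def delta_k(dist, n, k):
    """Delta^k(X) = (N - k) T(X) - sum_i T(X^{-i})."""
    full = list(range(n))
    loo = sum(total_correlation(dist, [j for j in full if j != i]) for i in full)
    return (n - k) * total_correlation(dist, full) - loo

def independent_join(a, b):
    """Product distribution of two statistically independent subsystems."""
    return {xa + xb: pa * pb for xa, pa in a.items() for xb, pb in b.items()}

xor = {(x, y, x ^ y): 0.25 for x, y in product([0, 1], repeat=2)}  # order-3 synergy
giant = {(0, 0, 0): 0.5, (1, 1, 1): 0.5}                           # order-3 redundancy
joint = independent_join(xor, giant)                               # 6 independent-block variables

for k in range(4):
    whole = delta_k(joint, 6, k)
    parts = delta_k(xor, 3, k) + delta_k(giant, 3, k)
    print(k, round(whole, 6), round(parts, 6))  # the two columns agree for every k
```

Because total correlation (and hence Δ^k) decomposes over independent blocks, the whole-system value matches the sum over subsystems, which is the extensivity property claimed above.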

Future Directions

The general framework suggests several avenues for further investigation:

  • Alternative dependency functions: Provided certain axioms are satisfied (non-negativity, monotonicity, fragility), other dependency measures may be substituted for total correlation, enabling generalizations to yet-unexplored domains.
  • Hierarchical analysis: The Δ^k and Γ^k functions facilitate hierarchical decompositions of dependency structure, with potential benefits for multi-scale modeling and dimensionality reduction.
  • Complex system phenomenology: By systematically quantifying the balance and order of synergies and redundancies, the hierarchy enables empirical investigation (in neuroscience, physiology, network biology, etc.) into the link between interaction structure and emergent system properties.

Conclusion

This paper provides a rigorous organizing framework connecting previously disparate multivariate information measures through the parameterized Δ^k (and conjugate Γ^k) family. This clarity facilitates targeted analysis of higher-order structure in complex systems, enables operational resolution of dominating interaction orders, and suggests broad paths for extension beyond the classical entropic context. The hierarchy represents a principled toolkit for future multiscale analysis and a foundation for advances in both theoretical and applied aspects of complex systems research.
