Information-Theoretic Phase Transitions
- Information-theoretic phase transitions are defined by non-analytic changes in metrics like mutual information, establishing fundamental detectability limits.
- They delineate sharp thresholds and critical behavior in diverse settings, from spiked matrix models to quantum circuits, using advanced replica and variational methods.
- These transitions offer universal diagnostics across equilibrium and non-equilibrium regimes, bridging statistical physics, quantum information, and high-dimensional inference.
Information-theoretic phase transitions denote qualitative changes in the information structure, propagation, or distinguishability of states in a physical, statistical, or computational system, as characterized by rigorous information-theoretic observables. These transitions can occur in a variety of settings: classical and quantum many-body systems, high-dimensional estimation and inference, random graphs, non-Hermitian models, and learning algorithms. Crucially, information-theoretic phase transitions are often diagnosed by universal statistical measures—mutual information, conditional mutual information, Fisher information, entropic distances, or transfer entropy—which exhibit non-analytic behavior, sometimes even when standard thermodynamic order parameters show no singularities. Recent works integrate replica methods, information geometry, and variational analysis to establish sharp thresholds and critical behavior in both equilibrium and non-equilibrium systems.
1. Fundamental Information-Theoretic Observables and Phase Transitions
Information-theoretic phase transitions are detected by critical behavior in quantities such as mutual information, conditional mutual information (CMI), Fisher information, transfer entropy, configurational entropy, and related entropic or divergence-based metrics. Phenomena include:
- Mutual information: In models of statistical inference (e.g., spiked matrix/tensor models, random graphs), the mutual information per variable displays non-analyticities marking fundamental limits for solution quality, detectability, or reconstructibility (Miolane, 2018, Coja-Oghlan et al., 2016).
- Conditional mutual information (CMI): For mixed-state quantum/classical Markov networks under dissipation, the onset of long-range CMI demarcates an information-theoretic phase boundary—even absent any thermodynamic transition (Zhang et al., 18 Feb 2025).
- Fisher information: The Fisher information (or its quantum generalization) often diverges at critical points in statistical mechanics and also bounds the detection power of machine learning indicators (Arnold et al., 2023).
- Global transfer entropy (GTE): GTE quantifies dynamical causal information flow and serves as a robust signature in both first- and second-order transitions in Potts and Ising models (Brown et al., 2018); a minimal plug-in estimator is sketched at the end of this section.
- Configurational entropy: Built from the spatial Fourier spectrum, configurational entropy attains a minimum at criticality, reflecting maximized storage of long-range information (Sowinski et al., 2016).
- Sample-complexity bounds: Information-theoretic bounds expressed through statistical distances (e.g., total variation, KL divergence) constrain critical-scaling exponents (e.g., via Harris-type criteria) in disordered systems (Feldman et al., 2023).
- Algorithmic information measures: In algorithmic information theory (AIT), the divergence of thermodynamically-inspired partition functions at a critical parameter reflects a computational phase transition (Tadaki, 2013).
These observables carry clear operational meaning: they bound optimal detection, distinguishability, or learning rates, and thereby expose the fundamental structure of the underlying physical or statistical phase transition.
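As a concrete illustration of how such observables are estimated in practice, the following is a minimal plug-in estimator for pairwise transfer entropy between two binary time series with history length 1. It is a sketch, not the method of Brown et al. (2018), who use global transfer entropy over all sites with longer histories and finite-sampling corrections; the function name `transfer_entropy` and the toy data are illustrative.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in estimate (in nats) of TE_{X->Y} with history length 1:
    TE = sum_{y1,y0,x0} p(y1,y0,x0) * log[ p(y1|y0,x0) / p(y1|y0) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # counts of (y_{t+1}, y_t, x_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # counts of (y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # counts of (y_{t+1}, y_t)
    singles_y = Counter(y[:-1])                     # counts of y_t
    n = len(y) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]        # p(y1 | y0, x0)
        p_cond_self = pairs_yy[(y1, y0)] / singles_y[y0]  # p(y1 | y0)
        te += p_joint * np.log(p_cond_full / p_cond_self)
    return te

# Toy usage: y copies x with one step of delay, so TE_{X->Y} should be large.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)
y = np.roll(x, 1)  # y_t = x_{t-1}
print(transfer_entropy(x, y))  # close to ln 2 ~ 0.693 nats
```

In the toy example the next value of y is fully determined by the current value of x, so the estimate approaches ln 2 per step; on critical spin-model trajectories the same estimator (with suitable histories) exhibits the peaks discussed above.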
2. Information-Theoretic Phase Transitions in Statistical Mechanics
In classical systems, information-theoretic probes such as mutual information between subsystems, global transfer entropy, or configurational entropy provide alternative (often more universal) diagnostics for criticality:
- Second-order (continuous) transitions: Mutual information and pairwise transfer entropy typically peak at the critical temperature, while configurational entropy exhibits a pronounced dip at criticality, indicating maximal spatial compressibility and information storage (Brown et al., 2018, Sowinski et al., 2016); a small exact-enumeration illustration follows this list.
- First-order transitions: For finite systems exhibiting first-order transitions (e.g., the Potts model), global transfer entropy peaks strictly on the disordered side—the signature of maximal information exchange prior to abrupt ordering. In the thermodynamic limit, this maximum approaches the transition from above (Brown et al., 2018). Entropy measures in tensor network renormalization (e.g., the von Neumann entropy of singular-value spectra) show sharp maxima at both continuous and first-order boundaries (Gangat et al., 2019).
- Geometric underpinning: In both first- and second-order transitions, information flow (e.g., GTE) and configurational entropy are governed by the geometry of interfaces or the scaling of spatial correlations. Maximal interface length in cluster boundaries aligns with the peak of GTE (Brown et al., 2018); critical scaling laws in the configurational entropy spectrum diagnose the build-up of long-range critical fluctuations (Sowinski et al., 2016).
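To make the mutual-information diagnostic concrete, here is a minimal sketch that exactly enumerates a tiny 4x4 periodic Ising lattice and computes the mutual information between its top and bottom halves across a temperature grid. The lattice size, temperature grid, and function names are illustrative assumptions; at this size one sees only the general temperature dependence, while the sharp near-critical peak of mutual information (per boundary length) emerges for large systems, as in the works cited above. Enumerating the 2^16 states takes a few seconds per temperature.

```python
import numpy as np
from itertools import product

def energy(s):
    # Nearest-neighbor Ising energy, J = 1, periodic boundaries.
    return -np.sum(s * np.roll(s, 1, axis=0)) - np.sum(s * np.roll(s, 1, axis=1))

def half_half_mi(beta, L=4):
    # Exact joint distribution over (top-half, bottom-half) spin configurations.
    n, h = L * L, L * L // 2
    p = np.zeros((2**h, 2**h))
    for bits in product((0, 1), repeat=n):
        s = 2 * np.array(bits).reshape(L, L) - 1        # map {0,1} -> {-1,+1}
        top = int("".join(map(str, bits[:h])), 2)       # index of top-half config
        bot = int("".join(map(str, bits[h:])), 2)       # index of bottom-half config
        p[top, bot] += np.exp(-beta * energy(s))
    p /= p.sum()
    pt, pb = p.sum(axis=1), p.sum(axis=0)               # marginals
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / np.outer(pt, pb)[mask])))

# Coarse temperature scan around T_c ~ 2.269 (infinite 2D lattice).
for T in (1.5, 2.0, 2.27, 3.0, 4.0):
    print(f"T = {T:4.2f}   I(top; bottom) = {half_half_mi(1.0 / T):.4f} nats")
```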
3. Phase Transitions in High-Dimensional Inference, Learning, and Random Structures
Information-theoretic phase transitions delineate regimes of inferential (im)possibility and computational tractability in high-dimensional estimation, statistical learning, and random-combinatorial models:
- Spiked matrix and tensor models: There are sharp thresholds in the signal-to-noise ratio below which reconstruction is statistically impossible and above which Bayes-optimal inference succeeds. In an intermediate "hard" region, detection is statistically possible but believed to be computationally intractable (e.g., no known polynomial-time algorithm attains the MMSE) (Miolane, 2018, Lesieur et al., 2017). The limiting mutual information admits a replica-symmetric variational formula, sketched after this list.
- Random graph models and constraint satisfaction: Mutual information controls the location of condensation and detectability transitions (e.g., in random coloring, stochastic block models). Information-theoretic phase boundaries are determined by variational functionals (e.g., Bethe free energy) and can predict "easy," "hard," and "hybrid-hard" inferential phases (Coja-Oghlan et al., 2016, Ricci-Tersenghi et al., 2018). The tightness of the Kesten–Stigum threshold and the topology of the phase diagram follow from cavity-derived expansions.
- Representational learning (Information Bottleneck, IB): Tuning the IB Lagrange multiplier yields discrete phase transitions—each associated with qualitative changes (bifurcations) in the learned representation. These transitions correspond to the model learning new, orthogonal directions of maximal (nonlinear) correlation between data and label, generalizing canonical correlation analysis to non-Gaussian settings (Wu et al., 2020).
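For the spiked Wigner model in the first bullet, the limiting mutual information admits a variational characterization. The following is one standard statement of the replica-symmetric formula, rigorously established in this setting (cf. Miolane, 2018); here ρ = E[X₀²] is the prior second moment, and technical hypotheses on the prior are suppressed:

```latex
% Rank-one spiked Wigner model: Y = \sqrt{\lambda/n}\, x x^{\top} + W,
% with x_i drawn i.i.d. from a prior P_0 and W a Wigner noise matrix.
\lim_{n\to\infty} \frac{1}{n}\, I(X;Y)
  \;=\; \min_{q \in [0,\rho]}
  \left[ \frac{\lambda}{4}\,(\rho - q)^2
       \;+\; I\!\left( X_0 ;\, \sqrt{\lambda q}\, X_0 + Z_0 \right) \right],
\qquad X_0 \sim P_0,\quad Z_0 \sim \mathcal{N}(0,1).
```

The information-theoretic transition occurs where the global minimizer q*(λ) changes non-analytically with the signal strength λ; in the standard picture, the hard phase corresponds to a range of λ where a suboptimal local minimum of this potential traps message-passing algorithms.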
4. Quantum and Non-Hermitian Models: Metric Geometry, Entropic Kinks, and CMI Phases
Quantum systems, especially under open dynamics or non-Hermitian perturbations, present distinctive information-theoretic phase transition signatures:
- Information geometry of steady states: The Bures (fidelity-induced) metric on the parameter manifold of non-equilibrium steady states diverges at dissipative quantum critical points, matching the closure of the Liouvillian gap and the scaling of two-point correlators. This metric-based approach enables detection of criticality without prior knowledge of an order parameter (Banchi et al., 2013); a simple ground-state analogue is sketched after this list.
- Non-Hermitian phase transitions: In spin-oscillator systems with non-Hermitian coupling, all entropic measures (Boltzmann, von Neumann, Rényi) constructed from the appropriate modified (non-Hermitian) inner product exhibit non-analytic behavior (a discontinuity in their first derivative) at the exceptional points marking the quantum phase transition. Comparison with the Ehrenfest classification confirms the first-order nature—information-theoretic measures "detect" and "classify" the transition purely from entropic kinks (Das et al., 29 Jun 2025).
- Long-range CMI phases: In decohered Gibbs states of commuting local Hamiltonians, the onset of long-range CMI signals a new, information-theoretic phase, not necessarily accompanied by any conventional symmetry breaking or thermodynamic singularity. Conditional mutual information thus exposes mixed-state correlation structure in both classical and quantum systems, with operational implications for teleportation, decoding, and state compression (Zhang et al., 18 Feb 2025).
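Banchi et al. (2013) work with dissipative steady states; as a simpler Hermitian analogue of the same information-geometric idea, the sketch below computes the ground-state fidelity susceptibility of a small transverse-field Ising chain, which peaks near the critical field g ≈ 1 without reference to any order parameter. The system size, field grid, step `dg`, and helper names (`op_at`, `tfim`, `ground`) are illustrative assumptions.

```python
import numpy as np
from functools import reduce

sx = np.array([[0., 1.], [1., 0.]])
sz = np.array([[1., 0.], [0., -1.]])
id2 = np.eye(2)

def op_at(op, site, n):
    # Embed a single-site operator at position `site` in an n-spin chain.
    return reduce(np.kron, (op if i == site else id2 for i in range(n)))

def tfim(n, g):
    # H = -sum_i sz_i sz_{i+1} - g * sum_i sx_i, periodic boundaries.
    H = np.zeros((2**n, 2**n))
    for i in range(n):
        H -= op_at(sz, i, n) @ op_at(sz, (i + 1) % n, n)
        H -= g * op_at(sx, i, n)
    return H

def ground(H):
    _, v = np.linalg.eigh(H)
    return v[:, 0]  # eigenvalues ascend, so column 0 is the ground state

n, dg = 8, 1e-3
for g in (0.6, 0.8, 0.9, 1.0, 1.1, 1.2, 1.5):
    fid = abs(ground(tfim(n, g)) @ ground(tfim(n, g + dg)))
    chi = 2.0 * (1.0 - fid) / dg**2   # fidelity susceptibility
    print(f"g = {g:4.2f}   chi_F = {chi:9.2f}")
```

The fidelity susceptibility is the leading coefficient of the fidelity drop between infinitesimally separated ground states, i.e., the diagonal of the quantum (Fubini-Study/Bures) metric; its finite-size peak sharpens and drifts toward g = 1 with increasing chain length.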
5. Information-Theoretic Bounds on Critical Exponents and Universality
The use of information-theoretic metrics yields universal, model-independent constraints on the scaling of physical observables at transition points:
- Critical exponent bounds in disordered systems: For any phase transition detectable by sampling the disorder, an information-theoretic argument demands that the correlation/localization-length exponent satisfy ν ≥ 2/d (with d the spatial dimension), recovering and sometimes strengthening the classic Harris criterion: a sample of linear size L contains only ~L^d independent random couplings, so the critical point can be located to accuracy no better than ~L^(-d/2). Violations of these bounds in finite-size numerics signal preasymptotic behavior, not genuine criticality (Feldman et al., 2023).
- Sample complexity and phase detection: The number of samples required to reliably distinguish neighboring phases scales as the inverse square of the statistical distance between the corresponding distributions (e.g., total variation, Hellinger). No algorithm can beat this detection threshold, establishing fundamental distinguishability limits rooted in the structure of information distances; see the tensorization identity below.
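A convenient way to see the inverse-square scaling is the exact tensorization identity of the squared Hellinger distance H²(P, Q) = 1 − ∫√(pq): reliable discrimination of P from Q from m i.i.d. samples requires the product-measure distance to be of order one. This is a schematic, textbook version rather than the precise form used in (Feldman et al., 2023):

```latex
H^2\!\left(P^{\otimes m},\, Q^{\otimes m}\right)
  \;=\; 1 - \bigl(1 - H^2(P, Q)\bigr)^{m}
\quad\Longrightarrow\quad
m \;=\; \Omega\!\left(\frac{1}{H^2(P, Q)}\right)
\ \text{samples are necessary for reliable discrimination.}
```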
6. Machine Learning Indicators, Fisher Information, and Inference Limits
Machine learning algorithms for phase boundary detection are implicitly lower-bounding the Fisher information of the underlying model:
- ML indicators and Fisher information: Classification- and regression-based phase-detection techniques yield indicators whose sensitivity is bounded above by the square root of the Fisher information, so the optimal sensitivity of any data-driven method is set by this fundamental information-theoretic metric (Arnold et al., 2023); the underlying inequality is sketched after this list.
- Practical criteria and failure modes: These bounds clarify why ML indicators peak at conventional second-order transitions (where the heat capacity or susceptibility diverges) but struggle to precisely locate infinite-order transitions (e.g., BKT). More broadly, the resolution of data-driven phase boundaries is limited by the informativeness of the underlying statistical manifold.
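Schematically, bounds of this kind follow from the Cauchy–Schwarz (information) inequality applied to any indicator ŷ computed from data x ~ p_t along a tuning parameter t; the exact indicator constructions in (Arnold et al., 2023) differ, but the mechanism is the same:

```latex
\bigl|\partial_t\, \mathbb{E}_t[\hat{y}]\bigr|
  \;=\; \bigl|\mathbb{E}_t\!\left[\hat{y}\,\partial_t \log p_t\right]\bigr|
  \;\le\; \sqrt{\mathrm{Var}_t(\hat{y})\;\mathcal{I}(t)},
\qquad
\mathcal{I}(t) \;:=\; \mathbb{E}_t\!\left[\bigl(\partial_t \log p_t\bigr)^{2}\right].
```

In words: the slope of any indicator across a putative boundary can never exceed its statistical fluctuations times the square root of the Fisher information, so a sharp indicator peak presupposes an informative statistical manifold.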
7. Dynamical Phase Transitions in Information Flow and Quantum Circuits
Information-theoretic analysis of information transport and scrambling in quantum circuits reveals new classes of dynamical phase transitions:
- Order parameters: The Holevo information (for classical bits) and the coherent information (for quantum bits), viewed as functions of space-time control parameters, exhibit non-analyticities (kinks) whose positions define critical points for the propagation and scrambling of information (Zhuang et al., 2023); a toy single-channel illustration follows this list.
- Universality: Critical exponents (e.g., the finite-size scaling exponent) are independent of circuit details (Clifford or Haar ensemble, classical or quantum input), suggesting new dynamical universality classes in nonequilibrium quantum information transport.
- Physical interpretation: Sharp dynamical transitions separate regimes of localized, ballistically propagating, and globally scrambled information, analogous to entanglement phase transitions in monitored quantum circuits but arising purely from unitary circuit growth and subsystem selection.
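The circuit setting of Zhuang et al. (2023) is beyond a few lines of code, but the role of coherent information as an information-theoretic order parameter can be shown in miniature. The sketch below evaluates the coherent information of a qubit depolarizing channel on a maximally entangled input (obtained from the Choi-state spectrum); its sign change at the hashing point p ≈ 0.189 marks the operational boundary below which the hashing bound guarantees a nonzero quantum-communication rate. The parameterization and function name are illustrative, and this single-channel toy is not the circuit transition itself.

```python
import numpy as np
from scipy.optimize import brentq

def coherent_information(p):
    """Coherent information (in qubits) of the depolarizing channel
    rho -> (1-p) rho + (p/3)(X rho X + Y rho Y + Z rho Z),
    on a maximally entangled input: I_c = 1 - S(Choi state)."""
    eigs = np.array([1 - p, p / 3, p / 3, p / 3])   # Choi-state spectrum
    eigs = eigs[eigs > 0]
    s_choi = -np.sum(eigs * np.log2(eigs))          # von Neumann entropy (bits)
    return 1.0 - s_choi

# Locate the sign change of I_c(p) on [0.01, 0.4].
p_star = brentq(coherent_information, 0.01, 0.4)
print(f"I_c crosses zero at p ~ {p_star:.4f}")      # ~ 0.1893 (hashing point)
```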
8. Outlook and Broader Significance
The study of information-theoretic phase transitions bridges information theory, statistical physics, quantum information, and high-dimensional statistics:
- These transitions provide order-parameter-agnostic, universal diagnostics, revealing fine structure in critical phenomena inaccessible to conventional observables.
- The framework applies to both equilibrium and non-equilibrium dynamics, captures computational hardness, and sharpens understanding of phase boundaries in the inference and learning landscape.
- Information-theoretic metrics establish rigorous bounds on sample complexity, scaling exponents, and the operational detectability of transitions.
- Future developments will tackle correlated disorder, mixed quantum-classical models, and a finer classification of universality in dynamical and learning-induced transitions, leveraging the continued synthesis of information geometry, variational methods, and algorithmic analysis.
Key references for the present synthesis include (Brown et al., 2018; Miolane, 2018; Sowinski et al., 2016; Zhang et al., 18 Feb 2025; Maillard et al., 2020; Lesieur et al., 2017; Feldman et al., 2023; Arnold et al., 2023; Banchi et al., 2013; Tadaki, 2013; Gangat et al., 2019; Wu et al., 2020; Coja-Oghlan et al., 2016; Melchert et al., 2012; Ricci-Tersenghi et al., 2018; Reeves et al., 2019; Das et al., 29 Jun 2025; Zhuang et al., 2023).