
Information Critical Phase Overview

Updated 2 January 2026
  • Information Critical Phase is a regime where information-theoretic measures, such as configurational entropy and Fisher information, diverge.
  • The concept generalizes traditional phase transitions by incorporating metrics that reveal nontrivial scaling laws in both classical and quantum systems.
  • It underpins novel diagnostic tools in fields like machine learning, neural networks, and topological codes, enabling enhanced predictions and metrological precision.

The Information Critical Phase represents an extended region or sharp locus in the space of control parameters (e.g., temperature, coupling strength, noise strength) where a system's information-theoretic descriptors—such as configurational entropy, Fisher information, mutual information, conditional mutual information, or predictive information—become singular, typically diverging or displaying nontrivial scaling behaviors. This concept generalizes the traditional notion of a physical phase transition by incorporating the storage, transmission, and processing of information as central order parameters. Information critical phases arise in classical and quantum systems, equilibrium and nonequilibrium settings, and extend to complex networks, disordered systems, neural ensembles, machine learning models, and topologically ordered phases under decoherence.

1. Configurational Entropy and Critical Phenomena

Configurational entropy (CE), introduced by Gleiser and Sowinski, is a spatial complexity measure defined via the Shannon entropy of the Fourier spectrum of field fluctuations around the ensemble mean. For a scalar field $\phi(x)$, the CE is

$$S_C[\phi] = - \int d^d k\, f(k)\,\ln f(k), \qquad f(k) = \frac{P(k)}{\int d^d k'\, P(k')}$$

where $P(k) = |\mathcal{F}[\Delta\phi](k)|^2$ is the power spectrum of fluctuations $\Delta\phi(x) = \phi(x) - \langle\phi\rangle$ (Sowinski et al., 2016; Gleiser et al., 2015).
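A minimal numerical sketch of this definition, assuming a 1D scalar field sampled on a lattice (the FFT-based estimator and the handling of the zero mode are illustrative choices, not the authors' code):

```python
import numpy as np

def configurational_entropy(field):
    """Configurational entropy S_C: Shannon entropy of the normalized
    power spectrum of fluctuations around the mean."""
    fluct = field - field.mean()
    power = np.abs(np.fft.fft(fluct)) ** 2
    power = power[1:]                 # zero mode vanishes after mean subtraction
    f = power / power.sum()           # modal fraction f(k)
    f = f[f > 0]                      # avoid log(0)
    return -np.sum(f * np.log(f))

rng = np.random.default_rng(0)
noise = rng.normal(size=1024)                       # broad spectrum -> high S_C
x = np.linspace(0, 2 * np.pi, 1024, endpoint=False)
wave = np.sin(8 * x)                                # single mode -> low S_C
print(configurational_entropy(noise) > configurational_entropy(wave))  # -> True
```

For the single-mode field the spectrum concentrates on two conjugate modes, so $S_C = \ln 2$; the noisy field spreads weight over all modes and yields a much larger value, matching the interpretation of $S_C$ as counting dominant modes.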

Key features and scaling:

  • $S_C$ quantifies the number of dominant modes in the field configuration: high $S_C$ implies broad, noise-like spectra; low $S_C$ indicates compressible, structured configurations.
  • In lattice Ginzburg–Landau simulations, $S_C(T)$ displays a sharp global minimum precisely at the critical temperature $T_c$, where the correlation length diverges, indicating maximal compressibility (i.e., minimal configurational uncertainty).
  • Three distinct $k$-space scaling regimes of the configurational-entropy density $s_C(k)$ are observed:
    • Scale-free (inactive): $s_C(k) \propto k^0$ (white noise) far from $T_c$.
    • Turbulent regime: $s_C(k) \propto k^{-5/3}$, coincident with Kolmogorov turbulence, signifying an inertial cascade of information between scales.
    • Critical regime: $s_C(k) \propto k^{-\sigma}$, with $\sigma = d - \eta = 7/4$ in 2D Ising universality, dominating near criticality.

The sharp minimum of $S_C$ at $T_c$ demonstrates that the critical point not only marks diverging physical correlations but also the maximal capacity for information storage and processing, thus establishing the system in an Information Critical Phase (Sowinski et al., 2016; Gleiser et al., 2015).
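The scaling regimes above are identified in practice by fitting the spectral slope in log-log space. A minimal sketch on synthetic, noiseless power-law data (the exponent-recovery procedure is illustrative; real spectra require binning and error estimates):

```python
import numpy as np

# Recover sigma from s_C(k) ~ k^{-sigma} by linear regression in
# log-log space; sigma = 7/4 corresponds to 2D Ising universality.
k = np.arange(1, 513, dtype=float)
sigma_true = 7.0 / 4.0
s_c = k ** (-sigma_true)

slope, intercept = np.polyfit(np.log(k), np.log(s_c), 1)
sigma_est = -slope
print(round(sigma_est, 3))   # -> 1.75 for clean data
```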

2. Scaling, Information Flow, and Nonlocal Correlators

Information-theoretic signatures of criticality extend to measures such as mutual information, conditional mutual information (CMI), and predictive information.

  • Mutual Information (MI): In two-dimensional disordered Ising models, the second Rényi MI $I_2(A,B)$ between spatial subsystems shows finite-size scaling crossings precisely at the critical temperature, indicating order-parameter-free sensitivity to the phase transition (Sriluckshmy et al., 2017).
  • Predictive Information: For Markovian processes, the mutual information between a long past and the future, $I_{\mathrm{pred}}(T)$, diverges as $\sim k\log T$ at the nonequilibrium critical point, measuring the subextensive information needed to predict future dynamics (Tchernookov et al., 2012).
  • CMI and Markov Length: In mixed-state topologically ordered systems (e.g., decohered $\mathbb{Z}_N$ toric codes), the CMI decays exponentially with buffer width, and the associated Markov length $\xi_M$ diverges throughout the information critical phase, even while ordinary correlation lengths remain finite (Vijay et al., 26 Dec 2025).

These information measures are not always tied to local order parameters; criticality can manifest as scaling, divergence, or plateaus in non-local information quantities, pointing to new universality classes and the need for broadened diagnostic tools.
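As a concrete illustration, mutual information between discrete variables can be estimated with a simple plug-in estimator (a generic construction, not the Rényi-MI machinery of the cited work):

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in nats for discrete samples."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            p_xy = np.mean((x == xv) & (y == yv))   # joint frequency
            if p_xy == 0:
                continue
            p_x = np.mean(x == xv)                  # marginals
            p_y = np.mean(y == yv)
            mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi

# Perfectly correlated bits carry ln 2 nats of mutual information.
a = np.array([0, 1, 0, 1, 1, 0, 1, 0])
print(mutual_information(a, a))   # ln 2 ≈ 0.693
```

Independent variables give values near zero, so finite-size crossings of such estimates can flag a transition without reference to any order parameter.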

3. Information Critical Phases in Quantum and Non-equilibrium Systems

Quantum generalizations of information criticality appear across several models:

  • Fisher Information in Open Quantum Chains: In boundary-driven XXZ spin chains, the quantum Fisher information $F_\Delta$ for the anisotropy $\Delta$ diverges superextensively (e.g., $F_\Delta \sim n^4$ at $|\Delta| = 1$) within the critical phase $|\Delta| \leq 1$ (Marzolino et al., 2017). This signals both quantum phase transitions and metrological applicability—superextensive scaling enables Heisenberg-limited parameter estimation.
  • Quantum Rabi Triangle: Near multiphase boundaries and triple points, the quantum Fisher information $I(\lambda)$ diverges as $|\lambda - \lambda_c|^{-\alpha}$ with universal exponents, and saturates the Heisenberg limit when both photon number and adiabatic time resources are accounted for. The locus of criticality generalizes to a higher-dimensional manifold, yielding a full information critical phase (Tang et al., 3 Nov 2025).
  • Decohered Topological Codes: In decohered $\mathbb{Z}_N$ toric codes with $N > 4$, an intermediate information critical phase appears, characterized by a diverging Markov length and fractional coherent information plateau, despite finite physical correlation length—an instance of gapless mixed-state order and emergent superfluidity in the dual model (Vijay et al., 26 Dec 2025).

Implications: These results show that quantum information measures may serve as sensitive probes for criticality—even in open, non-equilibrium, or topologically ordered systems—beyond the reach of conventional order parameters.
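The metrological role of Fisher information can be made concrete with a classical stand-in (a Gaussian location model, where $F = 1/\sigma^2$ saturates the Cramér–Rao bound; this is a pedagogical example, not the quantum Fisher information of the cited systems):

```python
import numpy as np

def fisher_information(logp, theta, x, eps=1e-5):
    """Monte Carlo estimate of the classical Fisher information
    F(theta) = E[(d log p / d theta)^2] via central differences."""
    score = (logp(x, theta + eps) - logp(x, theta - eps)) / (2 * eps)
    return np.mean(score ** 2)

# Gaussian location model: the normalization constant is theta-independent,
# so it can be dropped from log p; the exact answer is F = 1/sigma^2.
sigma = 2.0
rng = np.random.default_rng(1)
samples = rng.normal(loc=0.0, scale=sigma, size=200_000)
logp = lambda x, th: -0.5 * ((x - th) / sigma) ** 2
F = fisher_information(logp, 0.0, samples)
print(F)   # ~ 0.25 = 1/sigma^2
```

Larger $F$ means a sharper attainable bound $\mathrm{Var}(\hat\theta) \ge 1/F$, which is why divergent (superextensive) Fisher information in a critical phase translates directly into enhanced estimation precision.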

4. Information Critical Phases in Complex and Disordered Systems

Extended (non-point-like) information critical phases are observed in the following contexts:

  • Complex Networks and Percolation: Percolation on nonamenable graphs and small-world networks exhibits a critical phase—a finite parameter interval over which order-parameter fluctuations persist at all scales, and the largest cluster grows subextensively as $N^{\psi(p)}$, $0 < \psi(p) < 1$ (Hasegawa et al., 2014). The fractal exponent $\psi$ and scaling of the cluster-size distribution act as information-theoretic diagnostics, generalizing the notion of a critical point.
  • Disordered Systems: Information-theoretic sample-complexity arguments yield universal bounds on correlation length exponents, subsuming the Harris criterion ($\nu \ge 2/d$) and extending to Fock-space localization and measurement-induced entanglement transitions (Feldman et al., 2023).
  • Random Quantum Circuits: The flow of classical (Holevo) and quantum (coherent) information in dynamically evolving random circuits exhibits a hierarchy of dynamical phase transitions with diverging (spatio-temporal) correlation lengths and universal exponents, corresponding to the flow-through of an information critical phase (Zhuang et al., 2023).
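Cluster-size distributions of the kind used as diagnostics above are commonly fit with a maximum-likelihood power-law estimator. A minimal sketch on synthetic Pareto-distributed "cluster sizes" (the exponent $\tau = 2.5$ and the sampling scheme are illustrative, not taken from the cited works):

```python
import numpy as np

def powerlaw_mle(s, s_min=1.0):
    """Hill-type MLE for the tail exponent tau of p(s) ~ s^{-tau}, s >= s_min."""
    s = np.asarray(s, dtype=float)
    s = s[s >= s_min]
    return 1.0 + len(s) / np.sum(np.log(s / s_min))

rng = np.random.default_rng(2)
tau_true = 2.5
# Inverse-transform sampling of a continuous Pareto law with exponent tau.
u = rng.uniform(size=100_000)
samples = (1.0 - u) ** (-1.0 / (tau_true - 1.0))
print(round(powerlaw_mle(samples), 2))   # ~ 2.5
```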

5. Machine Learning and LLMs: Tunable Information Criticality

The concept of an information critical phase extends to machine learning and LLMs:

  • LLMs and Temperature-Driven Phase Transition: In GPT-2, as the temperature parameter $T$ is varied, statistical observables (e.g., order parameter, susceptibility, correlation length for POS-tag sequences) display singularities and power-law scaling at a critical temperature $T_c \approx 1$. At $T_c$:
    • Sequence correlations decay as $C(r) \sim r^{-\eta}$.
    • Susceptibilities and correlation lengths diverge, and relaxation dynamics slow down (critical slowing).
    • The system exhibits maximal balance between coherence and diversity—highlighting that practical model performance aligns with information criticality (Nakaishi et al., 2024).
  • Information-Concealing Machine Learning: Generic machine-learning protocols using information-concealing neural networks can extract critical exponents by systematically stripping spatial information and examining classifier degradation, demonstrating that information-theoretic observables of neural networks can serve as precise order parameters for complex phase transitions (Guo et al., 2021).
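The role of the sampling temperature can be sketched directly on toy logits; this illustrates only the generic entropy-temperature trade-off of a softmax, not the GPT-2 analysis of the cited work:

```python
import numpy as np

def softmax(logits, T):
    """Temperature-scaled softmax over token logits."""
    z = np.asarray(logits, dtype=float) / T
    z -= z.max()                      # numerical stability
    p = np.exp(z)
    return p / p.sum()

def entropy(p):
    """Shannon entropy (nats) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

logits = np.array([3.0, 1.5, 0.5, -1.0, -2.0])
for T in (0.2, 1.0, 5.0):
    print(T, round(entropy(softmax(logits, T)), 3))
```

Low $T$ concentrates mass on the top token (coherent but repetitive output); high $T$ approaches the uniform distribution (diverse but incoherent), with the reported critical balance in between.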

6. Neural Synchrony and Biological Systems

In biological neural systems, criticality in phase synchronisation is operationalized via scale-free fluctuations in the instantaneous rate of change of phase difference and verified by DFA/ML-DFA protocols. The presence of long-range temporal correlations (LRTCs; exponent $H > 0.5$) in synchronisation dynamics across Ising, Kuramoto, and empirical brain models demarcates the information critical phase—interpreted as the regime of maximal information transmission capability, neither locked nor desynchronized (Botcharova et al., 2014).
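A minimal DFA sketch, assuming first-order detrending over a fixed set of window sizes (the scale choices are illustrative, and the ML-DFA model-selection step of the cited protocol is omitted):

```python
import numpy as np

def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
    """First-order detrended fluctuation analysis: fit F(n) ~ n^H
    across window sizes n and return the scaling exponent H."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        ms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)       # linear detrend per window
            ms.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(ms)))         # RMS fluctuation at scale n
    H, _ = np.polyfit(np.log(scales), np.log(F), 1)
    return H

rng = np.random.default_rng(3)
H = dfa_exponent(rng.normal(size=65536))
print(round(H, 2))   # white noise gives H near 0.5
```

Uncorrelated noise yields $H \approx 0.5$; persistent LRTCs push $H$ above $0.5$, which is the signature used to demarcate the information critical phase in the synchronisation data.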

7. Significance and Outlook

The unifying insight of information critical phases is that information-theoretic measures—configurational entropy, Fisher information, mutual/conditional mutual information, predictive information, or statistical complexity—can act as either local or global order parameters, or their generalizations, for a vast range of classical and quantum transitions. This extends the universality and diagnostic power of critical phenomena to contexts where traditional symmetry-breaking approaches fail, including open systems, mixed states, high-dimensional or non-local order, disordered systems, complex networks, and machine learning models.

Open issues include:

  • The development of robust, system-independent information measures for phase detection.
  • Detailed characterization of scaling exponents and universality classes in non-equilibrium and mixed-state systems, especially in the presence of disorder or topology.
  • Practical metrological exploitation of the divergent sensitivity associated with information critical phases.
  • Design of optimal decoding and inference protocols in partially decohered or noisy quantum systems leveraging residual critical information protection.

The concept continues to guide both theoretical explorations and practical methodologies at the interface of information theory, statistical mechanics, and data-driven sciences (Sowinski et al., 2016, Gleiser et al., 2015, Vijay et al., 26 Dec 2025, Marzolino et al., 2017, Feldman et al., 2023, Nakaishi et al., 2024, Hasegawa et al., 2014, Botcharova et al., 2014, Zhuang et al., 2023, Guo et al., 2021).
