Information Critical Phase Overview
- The Information Critical Phase is a regime in which information-theoretic measures, such as configurational entropy and Fisher information, become singular or diverge.
- The concept generalizes traditional phase transitions by incorporating metrics that reveal nontrivial scaling laws in both classical and quantum systems.
- It underpins novel diagnostic tools in fields like machine learning, neural networks, and topological codes, enabling enhanced predictions and metrological precision.
The Information Critical Phase represents an extended region or sharp locus in the space of control parameters (e.g., temperature, coupling strength, noise strength) where a system's information-theoretic descriptors—such as configurational entropy, Fisher information, mutual information, conditional mutual information, or predictive information—become singular, typically diverging or displaying nontrivial scaling behaviors. This concept generalizes the traditional notion of a physical phase transition by incorporating the storage, transmission, and processing of information as central order parameters. Information critical phases arise in classical and quantum systems, equilibrium and nonequilibrium settings, and extend to complex networks, disordered systems, neural ensembles, machine learning models, and topologically ordered phases under decoherence.
1. Configurational Entropy and Critical Phenomena
Configurational entropy (CE), introduced by Gleiser and Sowinski, is a spatial complexity measure defined via the Shannon entropy of the Fourier spectrum of field fluctuations around the ensemble mean. For a scalar field $\phi(\mathbf{x})$ with fluctuation Fourier amplitudes $\tilde{\phi}(\mathbf{k})$, the CE is

$$S_C = -\sum_{\mathbf{k}} f(\mathbf{k}) \ln f(\mathbf{k}), \qquad f(\mathbf{k}) = \frac{|\tilde{\phi}(\mathbf{k})|^2}{\sum_{\mathbf{k}'} |\tilde{\phi}(\mathbf{k}')|^2},$$

where the modal fraction $f(\mathbf{k})$ is the normalized power spectrum of fluctuations (Sowinski et al., 2016, Gleiser et al., 2015).
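As a minimal numerical sketch (not code from the cited papers; the function name and example fields are illustrative), the lattice CE can be computed from the normalized FFT power spectrum:

```python
import numpy as np

def configurational_entropy(field):
    """Shannon entropy of the normalized power spectrum (modal fraction)
    of fluctuations around the spatial mean of a lattice field."""
    fluct = field - field.mean()
    power = np.abs(np.fft.fftn(fluct)) ** 2
    f = power / power.sum()      # modal fraction f(k)
    f = f[f > 1e-15]             # drop empty modes (0 * log 0 -> 0)
    return float(-np.sum(f * np.log(f)))
```

A single plane wave concentrates its spectrum in two modes, giving CE $= \ln 2$; broadband noise spreads over all modes and gives a much larger value, consistent with the "compressible versus noise-like" reading above.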
Key features and scaling:
- The CE quantifies the number of dominant modes in a field configuration: high CE implies a broad, noise-like spectrum; low CE indicates compressible, structured configurations.
- In lattice Ginzburg–Landau simulations, the CE displays a sharp global minimum precisely at the critical temperature $T_c$, where the correlation length diverges, indicating maximal compressibility (i.e., minimal configurational uncertainty).
- Three distinct $k$-space scaling regimes of the configurational-entropy density are observed:
- Scale-free (inactive): the fluctuation spectrum is flat, $\sim k^{0}$ (white noise), far from $T_c$.
- Turbulent regime: $\sim k^{-5/3}$, coincident with Kolmogorov turbulence, signifying an inertial cascade of information between scales.
- Critical regime: $\sim k^{-(2-\eta)}$, with $\eta = 1/4$ in the 2D Ising universality class, dominating near criticality.
The sharp minimum of the CE at $T_c$ demonstrates that the critical point marks not only diverging physical correlations but also the maximal capacity for information storage and processing, thus establishing the system in an Information Critical Phase (Sowinski et al., 2016, Gleiser et al., 2015).
2. Scaling, Information Flow, and Nonlocal Correlators
Information-theoretic signatures of criticality extend to measures such as mutual information, conditional mutual information (CMI), and predictive information.
- Mutual Information (MI): In two-dimensional disordered Ising models, the second Renyi MI between spatial subsystems shows finite-size scaling crossings precisely at the critical temperature, indicating order-parameter-free sensitivity to the phase transition (Sriluckshmy et al., 2017).
- Predictive Information: For Markovian processes, the mutual information between a long past and the future diverges at the nonequilibrium critical point, measuring the subextensive information needed to predict future dynamics (Tchernookov et al., 2012).
- CMI and Markov Length: In mixed-state topologically ordered systems (e.g., decohered toric codes), the CMI decays exponentially with buffer width, and the associated Markov length diverges throughout the information critical phase, even while ordinary correlation lengths remain finite (Vijay et al., 26 Dec 2025).
These information measures are not always tied to local order parameters; criticality can manifest as scaling, divergence, or plateaus in non-local information quantities, indicating new universality classes and the need for amended diagnostic tools.
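As a concrete, simplified illustration of such non-local information diagnostics (a generic plug-in estimator, not the Rényi-MI machinery of the cited works), mutual information between two observed symbol sequences can be estimated from empirical counts:

```python
import numpy as np
from collections import Counter

def plugin_mutual_information(xs, ys):
    """Empirical (plug-in) estimate of I(X;Y) in nats from paired samples."""
    assert len(xs) == len(ys)
    n = len(xs)
    joint = Counter(zip(xs, ys))   # empirical joint distribution p(x, y)
    px = Counter(xs)               # marginal counts for X
    py = Counter(ys)               # marginal counts for Y
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p(x,y) * log[ p(x,y) / (p(x) p(y)) ], with counts cancelled into n^2
        mi += p_xy * np.log(p_xy * n * n / (px[x] * py[y]))
    return mi
```

Perfectly correlated binary sequences give $\ln 2$ nats; independent ones give zero. In practice, finite-size scaling of such estimates across subsystem sizes is what locates the crossing at criticality.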
3. Information Critical Phases in Quantum and Non-equilibrium Systems
Quantum generalizations of information criticality appear across several models:
- Fisher Information in Open Quantum Chains: In boundary-driven XXZ spin chains, the quantum Fisher information with respect to the anisotropy parameter diverges superextensively with system size within the critical phase (Marzolino et al., 2017). This signals both quantum phase transitions and metrological applicability: superextensive scaling enables Heisenberg-limited parameter estimation.
- Quantum Rabi Triangle: Near multiphase boundaries and triple points, the quantum Fisher information diverges with universal critical exponents, and it saturates the Heisenberg limit when both photon number and adiabatic time resources are accounted for. The locus of criticality generalizes to a higher-dimensional manifold, yielding a full information critical phase (Tang et al., 3 Nov 2025).
- Decohered Topological Codes: In decohered toric codes, an intermediate information critical phase appears, characterized by a diverging Markov length and a fractional coherent-information plateau despite a finite physical correlation length, an instance of gapless mixed-state order and emergent superfluidity in the dual model (Vijay et al., 26 Dec 2025).
Implications: These results show that quantum information measures may serve as sensitive probes for criticality—even in open, non-equilibrium, or topologically ordered systems—beyond the reach of conventional order parameters.
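To make the fidelity-susceptibility reading of quantum Fisher information concrete, here is a toy two-level sketch (the Hamiltonian and function names are illustrative stand-ins, not the XXZ chain or Rabi triangle of the cited works). For a pure state, $F_Q \approx 8\,[1 - |\langle\psi(\lambda - \delta/2)|\psi(\lambda + \delta/2)\rangle|]/\delta^2$:

```python
import numpy as np

def ground_state(lam):
    # Toy avoided-crossing model: H = lam * sigma_z + sigma_x (illustrative)
    H = np.array([[lam, 1.0], [1.0, -lam]])
    vals, vecs = np.linalg.eigh(H)
    return vecs[:, 0]  # eigh returns eigenvalues in ascending order

def quantum_fisher_information(lam, delta=1e-3):
    """Finite-difference fidelity estimate of the pure-state QFI."""
    f = abs(np.vdot(ground_state(lam - delta / 2),
                    ground_state(lam + delta / 2)))
    return 8.0 * (1.0 - f) / delta ** 2
```

For this family the exact result is $F_Q(\lambda) = 1/(1+\lambda^2)^2$: the sensitivity peaks at the crossing $\lambda = 0$, the two-level caricature of a QFI divergence at criticality.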
4. Information Critical Phases in Complex and Disordered Systems
Extended (non-point-like) information critical phases are observed in the following contexts:
- Complex Networks and Percolation: Percolation on nonamenable graphs and small-world networks exhibits a critical phase: a finite parameter interval over which order-parameter fluctuations persist at all scales and the largest cluster grows subextensively as $N^{\psi}$ with $0 < \psi < 1$ (Hasegawa et al., 2014). The fractal exponent $\psi$ and the scaling of the cluster-size distribution act as information-theoretic diagnostics, generalizing the notion of a critical point.
- Disordered Systems: Information-theoretic sample-complexity arguments yield universal bounds on correlation-length exponents, subsuming the Harris criterion ($\nu \geq 2/d$) and extending to Fock-space localization and measurement-induced entanglement transitions (Feldman et al., 2023).
- Random Quantum Circuits: The flow of classical (Holevo) and quantum (coherent) information in dynamically evolving random circuits exhibits a hierarchy of dynamical phase transitions with diverging (spatio-temporal) correlation lengths and universal exponents, corresponding to the flow-through of an information critical phase (Zhuang et al., 2023).
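As an illustrative sketch of the largest-cluster diagnostic used in percolation studies (a plain union-find on a 2D site-percolation grid; not the cited authors' code), one can measure $s_{\max}$ and estimate the fractal exponent from its growth with system size:

```python
import numpy as np

def largest_cluster(occupied):
    """Size of the largest 4-connected cluster of occupied sites on a 2D grid."""
    n_rows, n_cols = occupied.shape
    parent = list(range(n_rows * n_cols))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj

    for r in range(n_rows):
        for c in range(n_cols):
            if not occupied[r, c]:
                continue
            if r + 1 < n_rows and occupied[r + 1, c]:
                union(r * n_cols + c, (r + 1) * n_cols + c)
            if c + 1 < n_cols and occupied[r, c + 1]:
                union(r * n_cols + c, r * n_cols + c + 1)

    sizes = {}
    for r in range(n_rows):
        for c in range(n_cols):
            if occupied[r, c]:
                root = find(r * n_cols + c)
                sizes[root] = sizes.get(root, 0) + 1
    return max(sizes.values()) if sizes else 0
```

Sweeping the occupation probability and system size $N$, the fractal exponent $\psi$ can be read off from the slope of $\log s_{\max}$ versus $\log N$ inside the critical interval.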
5. Machine Learning and LLMs: Tunable Information Criticality
The concept of an information critical phase extends to machine learning and LLMs:
- LLMs and Temperature-Driven Phase Transition: In GPT-2, as the temperature parameter is varied, statistical observables (e.g., order parameter, susceptibility, correlation length for POS-tag sequences) display singularities and power-law scaling at a critical temperature $T_c$. At $T_c$:
- Sequence correlations decay as a power law in token separation.
- Susceptibilities and correlation lengths diverge, and relaxation dynamics slow down (critical slowing).
- The system exhibits maximal balance between coherence and diversity—highlighting that practical model performance aligns with information criticality (Nakaishi et al., 2024).
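The correlation-decay diagnostic above can be sketched with a simple lag autocorrelation over a numerically encoded tag sequence (a generic estimator, not the exact pipeline of Nakaishi et al.):

```python
import numpy as np

def autocorrelation(seq, lag):
    """Normalized autocorrelation C(lag) of a numeric sequence."""
    x = np.asarray(seq, dtype=float)
    x = x - x.mean()                       # remove the mean
    return float(np.dot(x[: len(x) - lag], x[lag:]) / np.dot(x, x))
```

Near $T_c$, fitting $\log C(r)$ versus $\log r$ over a range of lags gives the decay exponent; away from criticality the same fit shows exponential cutoffs instead.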
- Information-Concealing Machine Learning: Generic machine-learning protocols using information-concealing neural networks can extract critical exponents by systematically stripping spatial information and examining classifier degradation, demonstrating that information-theoretic observables of neural networks can serve as precise order parameters for complex phase transitions (Guo et al., 2021).
6. Neural Synchrony and Biological Systems
In biological neural systems, criticality in phase synchronisation is operationalized via scale-free fluctuations in the instantaneous rate of change of phase difference and verified by DFA/ML-DFA protocols. The presence of long-range temporal correlations (LRTCs; DFA exponent $0.5 < \alpha < 1$) in synchronisation dynamics across Ising, Kuramoto, and empirical brain models demarcates the information critical phase, interpreted as the regime of maximal information transmission capability, neither locked nor desynchronized (Botcharova et al., 2014).
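The DFA procedure invoked above can be sketched as follows (window sizes and the test signal are illustrative; this is first-order DFA, not the ML-DFA variant of the cited work):

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Estimate the DFA scaling exponent alpha, where F(s) ~ s**alpha.
    alpha ~ 0.5 for white noise; 0.5 < alpha < 1 indicates LRTCs."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated profile
    fluctuations = []
    for s in scales:
        n_win = len(profile) // s
        t = np.arange(s)
        ms = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrend
            ms.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(ms)))
    alpha, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
    return float(alpha)
```

White noise yields $\alpha \approx 0.5$ and integrated noise (Brownian motion) $\alpha \approx 1.5$; LRTC estimates between these extremes, in the $0.5 < \alpha < 1$ band, are the signature used to demarcate the information critical phase.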
7. Significance and Outlook
The unifying insight of information critical phases is that information-theoretic measures—configurational entropy, Fisher information, mutual/conditional mutual information, predictive information, or statistical complexity—can act as either local or global order parameters, or their generalizations, for a vast range of classical and quantum transitions. This extends the universality and diagnostic power of critical phenomena to contexts where traditional symmetry-breaking approaches fail, including open systems, mixed states, high-dimensional or non-local order, disordered systems, complex networks, and machine learning models.
Open issues include:
- The development of robust, system-independent information measures for phase detection.
- Detailed characterization of scaling exponents and universality classes in non-equilibrium and mixed-state systems, especially in the presence of disorder or topology.
- Practical metrological exploitation of the divergent sensitivity associated with information critical phases.
- Design of optimal decoding and inference protocols in partially decohered or noisy quantum systems leveraging residual critical information protection.
The concept continues to guide both theoretical explorations and practical methodologies at the interface of information theory, statistical mechanics, and data-driven sciences (Sowinski et al., 2016, Gleiser et al., 2015, Vijay et al., 26 Dec 2025, Marzolino et al., 2017, Feldman et al., 2023, Nakaishi et al., 2024, Hasegawa et al., 2014, Botcharova et al., 2014, Zhuang et al., 2023, Guo et al., 2021).