Formulating a thermodynamic limit for trained neural diffusion networks
Formulate a rigorous thermodynamic limit for trained neural diffusion networks that specifies how architectural features—such as locality, receptive field size, sparsity, and equivariance—scale with system size, in order to make precise the effective field-theoretic description of critical behavior in large models.
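One way to make the scaling requirement concrete, offered here purely as an illustrative sketch (the network sequence $\{f_N\}$, the scaling exponents $\alpha, \beta$, and the observable $\phi_i$ are assumptions for exposition, not constructions from the source), is to index trained architectures by system size and demand convergence of intensive macroscopic observables:

```latex
% Hypothetical formalization: a sequence of trained diffusion networks indexed
% by system size N, with architectural scaling laws fixed a priori.
\begin{align}
  &\{f_N\}_{N \in \mathbb{N}}, \quad f_N : \mathbb{R}^N \to \mathbb{R}^N,
    && \text{trained diffusion networks on } N \text{ sites}, \\
  &r(N) = O(N^{\alpha}), \quad s(N) = O(N^{-\beta}),
    && \text{receptive-field and sparsity scalings}, \\
  &\lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} \langle \phi_i(f_N) \rangle = m,
    && \text{existence of intensive observables in the limit}.
\end{align}
```

In such a scheme, locality and equivariance would enter as constraints on the support of the weights and on the symmetry group under which each $f_N$ is invariant; the open problem is then to identify which scaling choices (if any) yield a well-defined effective field theory in the $N \to \infty$ limit.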
References
Making this precise requires a careful formulation of the "thermodynamic limit" for trained neural networks, including how architectural features scale with system size, an open and technically challenging problem.
— How Out-of-Equilibrium Phase Transitions can Seed Pattern Formation in Trained Diffusion Models
(2603.20092 - Ambrogioni, 20 Mar 2026) in Section: Out-of-Equilibrium Critical Phenomena in Real Deep Architectures