Latent-Space Mean-Field Theory for Deep BitNet-like Training: Constrained Gradient Flows with Smooth Quantization and STE Limits
Abstract: This work develops a mean-field analysis of the asymptotic behavior of deep BitNet-like architectures as the smooth quantization parameter approaches zero. We establish that the empirical measures of latent weights converge weakly to solutions of constrained continuity equations in the limit of vanishing quantization smoothing. Our main theoretical contribution demonstrates that the natural exponential decay in smooth quantization cancels the apparent singularities, yielding bounds on the mean-field dynamics that are uniform in the smoothing parameter. Under standard regularity assumptions, we prove convergence to a well-defined limit, providing a distributional mathematical foundation for gradient-based training of quantized neural networks.
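The cancellation described in the abstract can be illustrated concretely. Below is a minimal sketch (not the paper's construction; the specific quantizer is an assumption) using a `tanh` smoothing of the binary quantizer `sign(w)` with smoothing parameter `tau`. Its derivative carries a `1/tau` prefactor that looks singular as `tau -> 0`, but away from `w = 0` the `sech^2` term decays exponentially fast, cancelling the blow-up; the straight-through estimator (STE) instead replaces this derivative with the identity.

```python
import numpy as np

def smooth_quantize(w, tau):
    # Smooth surrogate for sign(w); approaches sign(w) pointwise
    # (for w != 0) as the smoothing parameter tau -> 0.
    return np.tanh(w / tau)

def smooth_quantize_grad(w, tau):
    # d/dw tanh(w / tau) = (1 / tau) * sech^2(w / tau).
    # The 1/tau factor appears singular as tau -> 0, but for fixed
    # w != 0, sech^2(w / tau) ~ 4 * exp(-2|w| / tau) decays
    # exponentially, so the product vanishes: the exponential decay
    # cancels the apparent singularity, as claimed in the abstract.
    return (1.0 / tau) * (1.0 / np.cosh(w / tau)) ** 2

def ste_grad(w):
    # Straight-through estimator: forward pass uses sign(w); the
    # backward pass treats the quantizer as the identity map.
    return np.ones_like(np.asarray(w, dtype=float))
```

For instance, at `w = 1.0` with `tau = 0.01`, the smooth gradient is astronomically small despite the `1/tau = 100` prefactor, while the STE gradient is identically 1; the paper's analysis concerns how the training dynamics behave between these two regimes as `tau` vanishes.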