
Generalization bounds for score-based generative models: a synthetic proof

Published 7 Jul 2025 in math.ST and stat.TH | arXiv:2507.04794v1

Abstract: We establish minimax convergence rates for score-based generative models (SGMs) under the $1$-Wasserstein distance. Assuming the target density $p^\star$ lies in a nonparametric $\beta$-smooth Hölder class with either compact support or sub-Gaussian tails on $\mathbb{R}^d$, we prove that neural network-based score estimators trained via denoising score matching yield generative models achieving the rate $n^{-(\beta+1)/(2\beta+d)}$ up to polylogarithmic factors. Our unified analysis handles arbitrary smoothness $\beta > 0$, supports both deterministic and stochastic samplers, and leverages shape constraints on $p^\star$ to induce regularity of the score. The resulting proofs are more concise and are grounded in generic stability of diffusions and standard approximation theory.
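To make the training objective referenced in the abstract concrete, here is a minimal sketch of denoising score matching on a one-dimensional toy target. This is not the paper's construction: the polynomial feature map (standing in for a neural network score estimator), the noise level, and all names such as `features` and `dsm_loss` are illustrative assumptions. The core identity it uses is standard: perturbing data $x_0$ with Gaussian noise of level $\sigma$ and regressing the model score at $x_t = x_0 + \sigma\varepsilon$ onto the conditional score $-(x_t - x_0)/\sigma^2$.

```python
import numpy as np

# Illustrative sketch (not from the paper): denoising score matching for a
# Gaussian-perturbed 1-D target, with a polynomial-feature score model standing
# in for the neural network estimator analyzed in the paper. All names are
# hypothetical.

rng = np.random.default_rng(0)

def features(x):
    # Simple polynomial features; a stand-in for a neural network score model.
    return np.stack([np.ones_like(x), x, x**2, x**3], axis=-1)

def dsm_loss(theta, x0, sigma):
    """Denoising score matching loss: perturb x0 with N(0, sigma^2) noise and
    regress the model score s_theta(x_t) onto the conditional score
    -(x_t - x0) / sigma^2."""
    noise = rng.normal(size=x0.shape)
    xt = x0 + sigma * noise
    target = -(xt - x0) / sigma**2
    pred = features(xt) @ theta
    return np.mean((pred - target) ** 2)

# Toy data from a smooth, compactly supported target (truncated Gaussian).
x0 = np.clip(rng.normal(size=2000), -3.0, 3.0)
sigma = 0.5

# Fit the linear-in-features score model by least squares on one noisy draw
# per data point (a crude substitute for gradient-based training).
noise = rng.normal(size=x0.shape)
xt = x0 + sigma * noise
target = -(xt - x0) / sigma**2
theta, *_ = np.linalg.lstsq(features(xt), target, rcond=None)
print("DSM loss after least-squares fit:", dsm_loss(theta, x0, sigma))
```

The fitted score would then be plugged into a deterministic (probability-flow ODE) or stochastic (reverse-diffusion) sampler; the paper's bounds concern how the resulting generated distribution converges to $p^\star$ in $1$-Wasserstein distance as the sample size $n$ grows.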
