Convergence of the Unadjusted Langevin Algorithm for Discontinuous Gradients
Abstract: We demonstrate that for strongly log-concave densities whose potential gradients are discontinuous on manifolds, the unadjusted Langevin algorithm (ULA) converges with stepsize bias of order $1/2$ in the Wasserstein-$p$ distance. The resulting bound is thus of the same order as the known convergence of ULA for gradient-Lipschitz potentials. Additionally, we show that so long as the gradient of the potential satisfies a growth bound (thereby imposing no regularity condition), the algorithm has stepsize bias of order $1/4$. We therefore unite two active areas of research: i) the study of numerical methods for SDEs with discontinuous coefficients, and ii) the study of the non-asymptotic bias of the ULA algorithm (and its variants). In particular, this is, to our knowledge, the first result of the former kind on an unbounded time interval.
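For concreteness, the ULA iteration studied here is $x_{k+1} = x_k - h\,\nabla U(x_k) + \sqrt{2h}\,\xi_k$ with $\xi_k \sim \mathcal{N}(0, I)$. Below is a minimal Python sketch applying it to an illustrative strongly log-concave target $\propto e^{-U}$ with $U(x) = |x| + x^2/2$, whose gradient $\operatorname{sign}(x) + x$ is discontinuous at the origin; the choice of potential, stepsize, and iteration count are our own illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ula(grad_U, x0, h, n_steps, rng=None):
    """Unadjusted Langevin Algorithm:
    x_{k+1} = x_k - h * grad_U(x_k) + sqrt(2h) * xi_k,  xi_k ~ N(0, I).
    Returns the trajectory of iterates (the stepsize h controls the bias).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    samples = np.empty((n_steps,) + x.shape)
    for k in range(n_steps):
        x = x - h * grad_U(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)
        samples[k] = x
    return samples

# Illustrative target: U(x) = |x| + x^2/2 is strongly convex, so exp(-U)
# is strongly log-concave, but grad U(x) = sign(x) + x jumps at x = 0.
grad_U = lambda x: np.sign(x) + x
samples = ula(grad_U, x0=np.zeros(1), h=1e-3, n_steps=100_000)
```

Per the abstract's order-$1/2$ bound, halving $h$ in such a sketch should shrink the Wasserstein bias by roughly a factor of $\sqrt{2}$, matching the rate known for gradient-Lipschitz potentials.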