Convergence Of The Unadjusted Langevin Algorithm For Discontinuous Gradients

Published 4 Dec 2023 in math.PR | (2312.01950v1)

Abstract: We demonstrate that for strongly log-concave densities whose potentials have gradients that are discontinuous on manifolds, the ULA algorithm converges with a stepsize bias of order $1/2$ in Wasserstein-$p$ distance. The resulting bound is of the same order as the convergence of ULA for gradient-Lipschitz potentials. Additionally, we show that so long as the gradient of the potential obeys a growth bound (thus imposing no regularity condition), the algorithm has a stepsize bias of order $1/4$. We therefore unite two active areas of research: i) the study of numerical methods for SDEs with discontinuous coefficients and ii) the study of the non-asymptotic bias of the ULA algorithm (and its variants). In particular, this is the first result of the former kind that we are aware of on an unbounded time interval.
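The setting of the abstract can be illustrated with a minimal sketch of the unadjusted Langevin algorithm, $x_{k+1} = x_k - h\,\nabla U(x_k) + \sqrt{2h}\,\xi_k$. The potential below, $U(x) = x^2/2 + |x|$, is our own illustrative choice (not taken from the paper): it is strongly convex, yet its gradient jumps at $x = 0$, i.e. it is discontinuous on the manifold $\{0\}$, exactly the kind of potential the paper's first result covers.

```python
import numpy as np

def grad_U(x):
    # Gradient of the illustrative potential U(x) = x^2/2 + |x|:
    # grad U(x) = x + sign(x), which jumps by 2 across x = 0.
    return x + np.sign(x)

def ula(grad, x0, h, n_steps, rng):
    # Unadjusted Langevin Algorithm:
    #   x_{k+1} = x_k - h * grad U(x_k) + sqrt(2h) * N(0, 1)
    x = x0
    for _ in range(n_steps):
        x = x - h * grad(x) + np.sqrt(2 * h) * rng.standard_normal(x.shape)
    return x

rng = np.random.default_rng(0)
# Run many independent chains so the final marginal approximates the
# stationary law of ULA, which is h-biased relative to exp(-U).
samples = ula(grad_U, np.zeros(10_000), h=0.01, n_steps=2_000, rng=rng)
print(samples.mean(), samples.std())
```

Despite the discontinuous drift, the chain equilibrates: the empirical mean is near 0 (the target is symmetric) and the empirical standard deviation is near the target's value of roughly 0.69, with a small stepsize-dependent bias of the kind the paper quantifies.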

Authors (2)
Citations (2)
