Tight Bounds on the Laplace Approximation Accuracy in High Dimensions

Published 28 May 2023 in math.ST and stat.TH | arXiv:2305.17604v4

Abstract: In Bayesian inference, a widespread technique to compute integrals against a high-dimensional posterior is to use a Gaussian proxy to the posterior known as the Laplace approximation. We address the question of the accuracy of this approximation in terms of TV distance, in the regime in which the dimension $d$ grows with the sample size $n$. Multiple prior works have shown that the requirement $d^3\ll n$ is sufficient for accuracy of the approximation. But in a recent breakthrough, Kasprzak et al., 2022 derived an upper bound scaling as $d/\sqrt n$. In this work, we further refine our understanding of the Laplace approximation error by decomposing the TV error into an $O(d/\sqrt n)$ leading-order term and an $O(d^2/n)$ remainder. This decomposition has far-reaching implications: first, we use it to prove that the requirement $d^2\ll n$ cannot in general be improved, by showing TV $\gtrsim d/\sqrt n$ for a posterior stemming from logistic regression with Gaussian design. Second, the decomposition provides tighter and more easily computable upper bounds on the TV error. Our result also opens the door to proving the BvM in the $d^2\ll n$ regime and correcting the Laplace approximation to account for skew; this is pursued in two follow-up works.
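The Laplace approximation the abstract refers to replaces the posterior with a Gaussian centered at the posterior mode, with covariance equal to the inverse Hessian of the negative log-posterior at that mode. A minimal sketch of this construction, on a toy logistic-regression-with-Gaussian-design instance like the one used for the paper's lower bound (the prior, design scaling, and problem sizes here are illustrative assumptions, not the paper's exact setup):

```python
import numpy as np

def laplace_approximation(grad, hess, x0, n_iter=50):
    # Newton's method to locate the MAP (posterior mode), then return the
    # Gaussian proxy N(mode, H^{-1}), where H is the Hessian of the
    # negative log-posterior evaluated at the mode.
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - np.linalg.solve(hess(x), grad(x))
    return x, np.linalg.inv(hess(x))

# Toy instance: logistic regression with Gaussian design and a standard
# normal prior (a hypothetical setup chosen so the problem is strictly convex).
rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.standard_normal((n, d)) / np.sqrt(d)
theta_true = np.ones(d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ theta_true))).astype(float)

def grad(t):
    # Gradient of the negative log-posterior (logistic loss + ridge prior).
    p = 1.0 / (1.0 + np.exp(-X @ t))
    return X.T @ (p - y) + t

def hess(t):
    # Hessian of the negative log-posterior.
    p = 1.0 / (1.0 + np.exp(-X @ t))
    return X.T @ (X * (p * (1.0 - p))[:, None]) + np.eye(d)

mode, cov = laplace_approximation(grad, hess, np.zeros(d))
```

The paper's result concerns how far this Gaussian proxy is from the true posterior in TV distance as $d$ grows with $n$; the sketch only shows how the proxy itself is built.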

Authors (1)
