
Nested importance sampling for Bayesian inference: error bounds and the role of dimension

Published 5 Jul 2025 in stat.CO and math.PR | (2507.04163v1)

Abstract: Many Bayesian inference problems involve high dimensional models for which only a subset of the model variables are actual estimation targets. All other variables are just nuisance variables that one would ideally like to integrate out analytically. Unfortunately, such integration is often impossible. However, there are several computational methods that have been proposed over the past 15 years that replace intractable analytical marginalisation by numerical integration, typically using different flavours of importance sampling (IS). Such methods include particle Markov chain Monte Carlo, sequential Monte Carlo squared (SMC$^2$), IS$^2$, nested particle filters and others. In this paper, we investigate the role of the dimension of the nuisance variables in the error bounds achieved by nested IS methods in Bayesian inference. We prove that, under suitable regularity assumptions on the model, the approximation errors increase at a polynomial (rather than exponential) rate with respect to the dimension of the nuisance variables. Our analysis relies on tools from functional analysis and measure theory and it includes the case of polynomials of degree zero, where the approximation error remains uniformly bounded as the dimension of the nuisance variables increases without bound. We also show how the general analysis can be applied to specific classes of models, including linear and Gaussian settings, models with bounded observation functions, and others. These findings improve our current understanding of when and how IS can overcome the curse of dimensionality in Bayesian inference problems.
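To make the nested-IS idea concrete, here is a minimal sketch (not taken from the paper) on a hypothetical linear-Gaussian toy model: an inner IS loop estimates the intractable marginal likelihood over the nuisance variable, and an outer IS loop uses that estimate to weight samples of the target variable. All names and model choices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative toy model (an assumption for this sketch, not the paper's model):
#   theta ~ N(0, 1)            target variable
#   x | theta ~ N(theta, 1)    nuisance variable to be marginalised
#   y | x ~ N(x, 1)            observation
y_obs = 1.5

def inner_is_likelihood(theta, n_inner=200):
    """Inner IS estimate of p(y | theta) = int p(y | x) p(x | theta) dx,
    proposing x from its conditional prior p(x | theta) so the weights
    reduce to the observation likelihood p(y | x)."""
    x = rng.normal(theta, 1.0, size=n_inner)
    lik = np.exp(-0.5 * (y_obs - x) ** 2) / np.sqrt(2 * np.pi)
    return lik.mean()

def nested_is_posterior_mean(n_outer=2000):
    """Outer IS over theta (proposal = prior), with the noisy inner
    estimate standing in for the intractable marginal likelihood.
    Returns a self-normalised IS estimate of E[theta | y]."""
    theta = rng.normal(0.0, 1.0, size=n_outer)
    w = np.array([inner_is_likelihood(t) for t in theta])
    w /= w.sum()
    return float(np.sum(w * theta))

est = nested_is_posterior_mean()
# Exact posterior mean here is y_obs / 3 = 0.5, since y = theta + noise
# with total variance 3; the nested estimate should land nearby.
```

The inner estimate is unbiased for p(y | theta), which is what makes the nested scheme consistent; the paper's contribution concerns how the resulting error grows as the nuisance variable x becomes high dimensional.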
