
Importance Sampling Methods for Bayesian Inference with Partitioned Data

Published 12 Oct 2022 in stat.ME and stat.AP (arXiv:2210.06620v2)

Abstract: This article presents new methodology for sample-based Bayesian inference when data are partitioned and communication between the parts is expensive, as arises by necessity in the context of "big data" or by choice in order to exploit computational parallelism. The method, which we call the Laplace enriched multiple importance estimator, uses new multiple importance sampling techniques to approximate posterior expectations using samples drawn independently from the local posterior distributions (those conditioned on isolated parts of the data). We also construct Laplace approximations from which additional samples can be drawn relatively quickly, improving the estimators in high-dimensional problems. The methods are "embarrassingly parallel", place no restriction on the sampling algorithm (including MCMC) or on the choice of prior distribution, and do not rely on any assumptions about the posterior such as normality. The performance of the methods is demonstrated and compared against some alternatives in experiments with simulated data.
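The core idea in the abstract — reweighting samples drawn independently from local posteriors to approximate full-posterior expectations — can be illustrated with a minimal sketch. The toy model below (a Gaussian mean with known variance and a flat prior, so every density is available in closed form) is an assumption for illustration only, not the paper's experimental setup; the estimator shown is standard self-normalised multiple importance sampling with a balance-heuristic mixture proposal, not the paper's full Laplace-enriched construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting (illustrative assumption): x_i ~ N(theta, 1), flat prior,
# data split across K workers, n points each.
K, n = 4, 100
theta_true = 2.0
data = rng.normal(theta_true, 1.0, size=(K, n))

# Each worker samples its local posterior theta | x_k ~ N(mean_k, 1/n).
# These draws are exact here; in general they would be MCMC output.
S = 5000
local_means = data.mean(axis=1)
samples = rng.normal(local_means[:, None], 1.0 / np.sqrt(n), size=(K, S))

def log_norm_pdf(x, mu, var):
    """Log density of N(mu, var) evaluated at x."""
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

# Multiple importance sampling with the balance heuristic:
# treat the pooled draws as coming from the mixture proposal
#   q(theta) = (1/K) * sum_k N(theta; mean_k, 1/n),
# and target the full posterior N(theta; grand_mean, 1/(K*n)).
theta = samples.ravel()
grand_mean = data.mean()
log_target = log_norm_pdf(theta, grand_mean, 1.0 / (K * n))
log_q = np.logaddexp.reduce(
    [log_norm_pdf(theta, m, 1.0 / n) for m in local_means], axis=0
) - np.log(K)

# Self-normalised importance weights, computed stably in log space.
log_w = log_target - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()

# Weighted estimate of the posterior mean of theta.
post_mean_hat = np.sum(w * theta)
```

Because the local posteriors overlap the full posterior in this well-behaved example, the weighted estimate recovers the exact posterior mean (the grand data mean) closely; the paper's contribution addresses the harder cases where naive weights degenerate, e.g. in high dimensions.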


Authors (1)
