Posterior Averaging Information Criterion

Published 19 Sep 2020 in stat.ME, math.ST, and stat.TH | (2009.09248v1)

Abstract: We propose a new model selection method, the posterior averaging information criterion, for Bayesian model assessment from a predictive perspective. The theoretical foundation is built on the Kullback-Leibler divergence, which quantifies the similarity between a proposed candidate model and the underlying true model. From a Bayesian perspective, our method evaluates candidate models over the entire posterior distribution in terms of predicting a future independent observation. Without assuming that the true distribution is contained in the candidate models, the new criterion is developed by correcting the asymptotic bias of the posterior mean of the log-likelihood relative to its expected log-likelihood. It can be applied generally, even to Bayesian models with degenerate non-informative priors. Simulations in both normal and binomial settings demonstrate decent small-sample performance.
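To make the core quantity concrete, the sketch below estimates the posterior mean of the log-likelihood by Monte Carlo in a conjugate normal-mean model, then applies a bias correction. Note the correction used here (subtracting the number of free parameters, DIC-style) is a hypothetical stand-in, not the paper's derived asymptotic correction; the model, prior, and sample sizes are likewise illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: normal model with unknown mean, known variance 1.
n = 50
y = rng.normal(loc=1.0, scale=1.0, size=n)

# Conjugate N(0, tau^2) prior on the mean gives a normal posterior.
tau2 = 10.0
post_var = 1.0 / (n + 1.0 / tau2)
post_mean = post_var * y.sum()

# Posterior draws of the mean parameter.
S = 5000
mu_draws = rng.normal(post_mean, np.sqrt(post_var), size=S)

def loglik(mu, y):
    """Log-likelihood of data y under N(mu, 1)."""
    return -0.5 * len(y) * np.log(2 * np.pi) - 0.5 * np.sum((y - mu) ** 2)

# Posterior mean of the log-likelihood (Monte Carlo estimate):
# the quantity whose asymptotic bias the paper corrects.
ll_draws = np.array([loglik(mu, y) for mu in mu_draws])
post_mean_ll = ll_draws.mean()

# Placeholder bias correction: number of free parameters (here 1).
# The paper derives the exact asymptotic correction term.
p = 1
criterion = post_mean_ll - p  # higher means better predicted fit
print(criterion)
```

Between candidate models, one would compute this criterion for each and prefer the model with the larger value, mirroring how predictive information criteria are typically compared.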
