
Conditional inferential models: combining information for prior-free probabilistic inference

Published 7 Nov 2012 in math.ST, stat.ME, and stat.TH | (1211.1530v5)

Abstract: The inferential model (IM) framework provides valid prior-free probabilistic inference by focusing on predicting unobserved auxiliary variables. However, efficient IM-based inference can be challenging when the auxiliary variable is of higher dimension than the parameter. Here we show that features of the auxiliary variable are often fully observed and, in such cases, simultaneous dimension reduction and information aggregation can be achieved by conditioning. The proposed conditioning strategy leads to efficient IM inference and casts new light on Fisher's notions of sufficiency and conditioning, as well as on Bayesian inference. A differential equation-driven selection of a conditional association is developed, and validity of the conditional IM is proved under some conditions. For problems that do not admit a valid conditional IM of the standard form, we propose a more flexible class of conditional IMs based on localization. Examples of local conditional IMs in a bivariate normal model and a normal variance components model are also given.


Authors (2)
