Bayesian Ultrahigh-Dimensional Screening Via MCMC
Abstract: We explore the theoretical and numerical properties of a fully Bayesian model selection method in sparse ultrahigh-dimensional settings, i.e., $p\gg n$, where $p$ is the number of covariates and $n$ is the sample size. Our method consists of (1) a hierarchical Bayesian model with a novel prior on the model space that includes a hyperparameter $t_n$ controlling the model size, and (2) an efficient MCMC algorithm for automatic, stochastic search over models. Our theory shows that when $t_n$ is correctly specified, the proposed method achieves selection consistency, i.e., the posterior probability of the true model asymptotically approaches one; when $t_n$ is misspecified, the selected model is still asymptotically nested in the true model. The theory also reveals the insensitivity of the selection result to the choice of $t_n$. In implementation, a reasonable prior is further placed on $t_n$, which allows us to draw its samples stochastically. Our approach conducts selection, estimation, and even inference in a unified framework; no additional prescreening or dimension-reduction step is needed. Two novel $g$-priors are proposed to make the approach more flexible. A simulation study demonstrates the numerical advantages of our method.
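The flavor of the approach can be illustrated with a minimal sketch: a Metropolis-Hastings search over the model space $\{0,1\}^p$ using single-variable flip proposals, where each model's marginal likelihood is computed under a standard Zellner $g$-prior (with $\sigma^2$ marginalized out) and the model prior penalizes size and enforces the cap $|\gamma| \le t_n$. This is a generic g-prior stochastic search, not the authors' exact algorithm or their novel priors; the penalty parameter `kappa` and the unit-information default `g = n` are illustrative assumptions.

```python
import numpy as np

def log_marginal(X, y, gamma, g):
    """Log marginal likelihood of model gamma under a Zellner g-prior,
    with sigma^2 integrated out against a Jeffreys prior (constants dropped):
    -0.5*|gamma|*log(1+g) - 0.5*n*log( y'y - g/(1+g) * y'P_gamma y )."""
    n = len(y)
    yty = y @ y
    k = int(gamma.sum())
    if k == 0:
        shrunk_rss = yty
    else:
        Xg = X[:, gamma]
        # least-squares fit gives the projection of y onto col(Xg)
        beta_hat, *_ = np.linalg.lstsq(Xg, y, rcond=None)
        shrunk_rss = yty - (g / (1 + g)) * (y @ (Xg @ beta_hat))
    return -0.5 * k * np.log(1 + g) - 0.5 * n * np.log(shrunk_rss)

def mcmc_search(X, y, t_n, n_iter=4000, g=None, kappa=1.0, seed=0):
    """Metropolis-Hastings over models with single-flip proposals and
    model prior proportional to p^(-kappa*|gamma|) * 1{|gamma| <= t_n}.
    Returns posterior inclusion frequencies for each covariate."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    g = n if g is None else g  # unit-information g-prior by default
    gamma = np.zeros(p, dtype=bool)          # start from the null model
    cur = log_marginal(X, y, gamma, g) - kappa * gamma.sum() * np.log(p)
    counts = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)                  # propose flipping one coordinate
        prop = gamma.copy()
        prop[j] = ~prop[j]
        if prop.sum() <= t_n:                # respect the size cap t_n
            new = log_marginal(X, y, prop, g) - kappa * prop.sum() * np.log(p)
            if np.log(rng.uniform()) < new - cur:   # symmetric proposal
                gamma, cur = prop, new
        counts += gamma
    return counts / n_iter
```

With a strong sparse signal (e.g. $n=120$, $p=200$, three active covariates), the chain concentrates on the true variables: their inclusion frequencies approach one while all noise variables stay well below.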