Adaptive Bayesian density estimation using Pitman-Yor or normalized inverse-Gaussian process kernel mixtures
Abstract: We consider Bayesian nonparametric density estimation using a Pitman-Yor or a normalized inverse-Gaussian process kernel mixture as the prior distribution for a density. The procedure is studied from a frequentist perspective. Using the stick-breaking representation of the Pitman-Yor process or the expression of the finite-dimensional distributions of the normalized inverse-Gaussian process, we prove that, when the data are replicates from an infinitely smooth density, the posterior distribution concentrates on any shrinking $L^p$-norm ball, $1\leq p\leq\infty$, around the sampling density at a \emph{nearly parametric} rate, up to a logarithmic factor. The resulting hierarchical Bayesian procedure, with a fixed prior, is thus shown to be adaptive to the infinite degree of smoothness of the sampling density.
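As an aside, the stick-breaking representation invoked in the abstract can be illustrated with a short numerical sketch. For a Pitman-Yor process with discount $d \in [0,1)$ and strength $\theta > -d$, the mixture weights arise as $w_k = V_k \prod_{j<k}(1 - V_j)$ with $V_k \sim \mathrm{Beta}(1-d,\, \theta + k d)$. The function name, parameter names, and truncation level below are illustrative choices, not part of the paper:

```python
import numpy as np

def pitman_yor_stick_breaking(discount, strength, num_sticks, rng=None):
    """Draw truncated stick-breaking weights for a Pitman-Yor process.

    V_k ~ Beta(1 - discount, strength + k * discount),  k = 1, ..., num_sticks
    w_k = V_k * prod_{j < k} (1 - V_j)

    Setting discount = 0 recovers the Dirichlet process special case.
    """
    rng = np.random.default_rng(rng)
    k = np.arange(1, num_sticks + 1)
    v = rng.beta(1.0 - discount, strength + k * discount)
    # Length of stick remaining before the k-th break.
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - v)[:-1]))
    return v * remaining

# Example draw: weights are nonnegative and sum to (nearly) 1 under truncation.
weights = pitman_yor_stick_breaking(discount=0.25, strength=1.0,
                                    num_sticks=500, rng=0)
```

In a kernel mixture prior, each weight $w_k$ would be paired with a kernel location drawn i.i.d. from the base measure, yielding a random density $\sum_k w_k\, K(\cdot \mid \phi_k)$.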