
Block bootstrap optimality for density estimation with dependent data

Published 5 Sep 2019 in math.ST, stat.ME, and stat.TH | arXiv:1909.02662v1

Abstract: Accurate approximation of the sampling distribution of nonparametric kernel density estimators is crucial for many statistical inference problems. Since these estimators have complex asymptotic distributions, bootstrap methods are often used for this purpose. With i.i.d. observations, a large literature exists on optimal bootstrap methods that achieve the fastest possible convergence rate of the bootstrap estimator of the sampling distribution of the kernel density estimator. With dependent data, such an optimality theory is an important open problem. We establish a general theory of optimality of the block bootstrap for kernel density estimation under weak dependence assumptions that are satisfied by many important time series models. We propose a unified framework for a theoretical study of a rich class of bootstrap methods, which includes as special cases subsampling, Künsch's moving block bootstrap, Hall's under-smoothing (UNS), and approaches with no bias correction (NBC) or with explicit bias correction (EBC). Moreover, we consider their accuracy under a broad spectrum of choices of the bandwidth $h$, which includes the MSE-optimal choice as an important special case, as well as other under-smoothed choices. Under each choice of $h$, we derive the optimal tuning parameters and compare the optimal performances of the main subclasses (EBC, NBC, UNS) of bootstrap methods.
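To make the moving block bootstrap idea concrete, the sketch below resamples overlapping blocks of a time series with replacement and recomputes a Gaussian-kernel density estimate on each pseudo-series. This is a minimal illustration of the general MBB scheme, not the paper's exact procedure: the bandwidth `h`, block length, and the evaluation point `x0` are illustrative assumptions, and the tuning rules derived in the paper are not implemented here.

```python
import numpy as np

def gaussian_kde_at(x, data, h):
    """Kernel density estimate at a single point x, Gaussian kernel, bandwidth h."""
    u = (x - data) / h
    return np.mean(np.exp(-0.5 * u**2)) / (h * np.sqrt(2.0 * np.pi))

def moving_block_bootstrap_kde(data, x0, h, block_len, n_boot, seed=None):
    """Approximate the sampling distribution of the KDE at x0 via the
    moving block bootstrap: draw overlapping blocks of length block_len
    with replacement, concatenate them into a pseudo-series of the
    original length, and recompute the estimator on each replicate."""
    rng = np.random.default_rng(seed)
    n = len(data)
    # Start indices of all overlapping blocks data[i:i+block_len].
    starts = np.arange(n - block_len + 1)
    n_blocks = int(np.ceil(n / block_len))
    estimates = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.choice(starts, size=n_blocks, replace=True)
        pseudo = np.concatenate([data[i:i + block_len] for i in idx])[:n]
        estimates[b] = gaussian_kde_at(x0, pseudo, h)
    return estimates
```

As a usage example, one could apply this to an AR(1) series (a weakly dependent process of the kind covered by the paper's assumptions), with an under-smoothed bandwidth of order $n^{-1/5}$ chosen purely for illustration; the spread of the returned replicates then serves as a bootstrap approximation to the sampling variability of the density estimator at `x0`.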
