
Bootstrap tuning in ordered model selection

Published 17 Jul 2015 in math.ST and stat.TH | arXiv:1507.05034v1

Abstract: In the problem of model selection for a given family of linear estimators, ordered by their variance, we offer a new "smallest accepted" approach motivated by Lepski's method and multiple testing theory. The procedure selects the smallest model that satisfies an acceptance rule based on comparison with all larger models. The method is completely data-driven and uses no prior information about the variance structure of the noise: its parameters are adjusted to the underlying, possibly heterogeneous, noise via the so-called "propagation condition" using a wild bootstrap method. The validity of the bootstrap calibration is proved for finite samples with an explicit error bound. We provide a comprehensive theoretical study of the method and describe in detail the set of possible values of the selector $\hat{m}$. We also establish precise oracle error bounds for the corresponding estimator $\hat{\theta} = \tilde{\theta}_{\hat{m}}$, which apply equally to estimation of the whole parameter vector, a subvector or linear mapping of it, and a linear functional.
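To make the "smallest accepted" idea concrete, the following is a minimal sketch, not the paper's exact procedure: candidate estimators are projections of a sequence-model observation onto its first $m$ coordinates (so larger models have larger variance), model $m$ is accepted when every increment $\|\hat{\theta}_{m'} - \hat{\theta}_m\|$ to a larger model $m'$ stays below a critical value, and the critical values are calibrated by a Gaussian-multiplier (wild) bootstrap. The function name, the model family, and the specific multiplier scheme are all illustrative assumptions.

```python
import numpy as np

def smallest_accepted(y, models, n_boot=500, alpha=0.1, rng=None):
    """Illustrative 'smallest accepted' selector for a sequence model.

    Hypothetical sketch: estimators are projections onto the first m
    coordinates of y.  Model m is *accepted* if, for every larger model
    m', the increment ||theta_hat_{m'} - theta_hat_m|| stays below a
    critical value obtained from a Gaussian-multiplier wild bootstrap.
    Returns the smallest accepted model; the largest model has no
    larger competitors and is accepted vacuously, so a choice exists.
    """
    rng = rng or np.random.default_rng(0)
    models = sorted(models)
    n = len(y)

    # Wild bootstrap: multiply observations by independent N(0,1)
    # weights.  Under the "propagation condition" (coordinates beyond
    # the selected model are pure noise), the bootstrap increments
    # mimic the scale of the noise, without knowing its variance.
    g = rng.standard_normal((n_boot, n))
    yb = g * y  # bootstrap samples, shape (n_boot, n)

    def increment(a, lo, hi):
        # ||theta_hat_{hi} - theta_hat_{lo}|| for projection estimators
        # equals the Euclidean norm of coordinates lo..hi-1.
        return np.sqrt(np.sum(np.atleast_2d(a)[:, lo:hi] ** 2, axis=-1))

    for i, m in enumerate(models):
        accepted = True
        for m2 in models[i + 1:]:
            t = increment(y, m, m2)[0]               # observed increment
            z = np.quantile(increment(yb, m, m2), 1 - alpha)  # critical value
            if t > z:
                accepted = False                      # m rejected; try larger
                break
        if accepted:
            return m
    return models[-1]  # unreachable: the last model is always accepted
```

A typical use: with signal concentrated in the first few coordinates, the selector should stop at a small model rather than pay the variance of the full one.

```python
rng = np.random.default_rng(1)
theta = np.zeros(200)
theta[:5] = 4.0                      # signal in the first 5 coordinates
y = theta + rng.standard_normal(200)
m_hat = smallest_accepted(y, [5, 10, 25, 50, 100, 200], rng=rng)
```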
