On an estimator achieving the adaptive rate in nonparametric regression under $L^p$-loss for all $1\leq p \leq \infty$

Published 13 Mar 2013 in math.ST, stat.ME, and stat.TH (arXiv:1303.3118v2)

Abstract: Consider nonparametric function estimation under $L^p$-loss. The minimax rate for estimation of the regression function over a Hölder ball with smoothness index $\beta$ is $n^{-\beta/(2\beta+1)}$ if $1\leq p<\infty$ and $(n/\log n)^{-\beta/(2\beta+1)}$ if $p=\infty$. There are many known procedures that either attain this rate for $p=\infty$ but are suboptimal by a $\log n$ factor in the case $p<\infty$, or the other way around. In this article, we construct an estimator that simultaneously achieves the optimal rates under $L^p$-risk for all $1\leq p\leq \infty$ without prior knowledge of $\beta$. In contrast to classical wavelet thresholding methods, which kill small empirical wavelet coefficients and keep large ones, it is essential for simultaneous adaptation that on each resolution level the largest empirical wavelet coefficients are truncated. This leads to a completely different point of view on wavelet thresholding. The crucial part of the construction is the size of the truncation level, which is linked to the unknown smoothness index. Although estimation of the smoothness index is known to be a difficult task, there is a data-driven choice of the truncation level that is sufficiently precise for our purpose.
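The contrast the abstract draws — classical thresholding kills small empirical wavelet coefficients, while the proposed estimator truncates (caps) the largest coefficients on each resolution level — can be sketched in a few lines. This is a toy illustration of the two operations, not the paper's estimator: the function names are hypothetical, the truncation level `tau` is fixed by hand here, and the paper's data-driven choice of the truncation level is not reproduced.

```python
import numpy as np

def hard_threshold(coeffs, t):
    """Classical hard thresholding: set coefficients with |c| <= t to zero,
    keep the large ones unchanged."""
    return np.where(np.abs(coeffs) > t, coeffs, 0.0)

def truncate_level(coeffs, tau):
    """Level-wise truncation in the spirit of the abstract: cap every
    coefficient at +/- tau, so the *largest* coefficients are the ones
    that get modified. (tau is a hand-picked stand-in for the paper's
    data-driven, smoothness-linked truncation level.)"""
    return np.clip(coeffs, -tau, tau)

# Toy empirical wavelet coefficients on one resolution level.
c = np.array([0.05, -0.02, 1.8, -2.5, 0.4])

print(hard_threshold(c, 0.1))   # small coefficients are killed
print(truncate_level(c, 1.0))   # large coefficients are capped at +/- 1.0
```

The two maps act on opposite ends of the coefficient spectrum, which is exactly the reversal of viewpoint the abstract emphasizes.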


Authors (1)
