
Penalized Estimation in Additive Regression with High-Dimensional Data

Published 24 Apr 2017 in math.ST and stat.TH (arXiv:1704.07229v1)

Abstract: Additive regression extends linear regression by modeling the signal of a response as a sum of functions of covariates of relatively low complexity. We study penalized estimation in high-dimensional nonparametric additive regression, where functional semi-norms are used to induce smoothness of the component functions and the empirical $L_2$ norm is used to induce sparsity. The functional semi-norms can be of Sobolev or bounded-variation type and are allowed to differ across individual component functions. We establish new oracle inequalities for the predictive performance of such methods under three simple technical conditions: a sub-gaussian condition on the noise, a compatibility condition on the design and the functional classes under consideration, and an entropy condition on the functional classes. For random designs, the sample compatibility condition can be replaced by its population version under an additional condition ensuring suitable convergence of empirical norms. In homogeneous settings, where the complexities of the component functions are of the same order, our results provide a spectrum of explicit convergence rates, from the so-called slow rate, which requires no compatibility condition, to the fast rate under hard sparsity or certain $L_q$ sparsity conditions that allow many small components in the true regression function. These results significantly broaden and sharpen existing ones in the literature.
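To make the setting concrete, below is a minimal sketch of a sparse additive model fit. It implements only the empirical-$L_2$ (group-lasso-style) sparsity penalty via backfitting with group soft-thresholding; the paper's general functional semi-norm smoothness penalties (Sobolev or bounded-variation) are not implemented. The polynomial basis, the tuning value `lam`, and all function names are illustrative assumptions, not the authors' estimator.

```python
import numpy as np

def poly_basis(x, df=5):
    # Simple polynomial basis for one covariate (illustration only;
    # the paper allows general function classes with semi-norm penalties).
    return np.column_stack([x ** k for k in range(1, df + 1)])

def fit_sparse_additive(X, y, lam=0.05, df=5, n_iter=200):
    """Backfitting with group soft-thresholding for
        min (1/2n) ||y - mu - sum_j f_j||_2^2 + lam * sum_j ||f_j||_n,
    where ||g||_n = sqrt(mean(g^2)) is the empirical L2 norm.
    A simplified sketch, not the paper's full estimator."""
    n, p = X.shape
    # Orthonormalize each centered basis block so the proximal step
    # reduces to a closed-form group soft-threshold.
    Q = []
    for j in range(p):
        Bj = poly_basis(X[:, j], df)
        Q.append(np.linalg.qr(Bj - Bj.mean(axis=0))[0])
    mu = y.mean()
    f = np.zeros((n, p))          # fitted component functions, one column each
    for _ in range(n_iter):
        for j in range(p):
            r = y - mu - f.sum(axis=1) + f[:, j]   # partial residual
            proj = Q[j] @ (Q[j].T @ r)             # least-squares fit on block j
            norm = np.sqrt((proj ** 2).mean())     # empirical L2 norm
            scale = max(0.0, 1.0 - lam / norm) if norm > 0 else 0.0
            f[:, j] = scale * proj                  # group soft-threshold
    return mu, f
```

On simulated data with two active components, the empirical norms of the fitted `f[:, j]` separate the active covariates from the inactive ones, mirroring the sparsity induced by the empirical-$L_2$ penalty.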


