High-Dimensional Sparse Single-Index Regression Via Hilbert-Schmidt Independence Criterion
Abstract: The Hilbert-Schmidt Independence Criterion (HSIC) has recently been used in single-index models to estimate the index direction. Compared with other well-established methods, it requires relatively weaker conditions. However, its performance has not yet been studied in the high-dimensional scenario, where the number of covariates is much larger than the sample size. In this article, we propose a new, efficient sparse estimator for the HSIC-based single-index model. This method estimates the subspace spanned by the linear combinations of the covariates directly and performs variable selection simultaneously. Because the objective function is non-convex, we use a majorize-minimize approach together with the linearized alternating direction method of multipliers (ADMM) algorithm to solve the optimization problem. The algorithm does not involve the inverse of the covariance matrix and therefore handles the large-p-small-n scenario naturally. Through extensive simulation studies and a real data analysis, we show that our proposal is efficient and effective in the high-dimensional setting. The MATLAB code for this method is available online.
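To make the central quantity concrete, the following is a minimal sketch of the standard empirical (biased) HSIC estimator, HSIC(X, Y) = n^-2 tr(K H L H), where K and L are kernel Gram matrices and H is the centering matrix. This is a generic illustration in Python with Gaussian kernels, not the authors' sparse estimation procedure; the function names and the fixed bandwidth are assumptions for the example.

```python
import numpy as np

def rbf_gram(x, sigma=1.0):
    # Gaussian-kernel Gram matrix from pairwise squared distances.
    # sigma is an illustrative fixed bandwidth; in practice it is
    # often set by the median heuristic.
    d2 = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC: n^-2 * tr(K H L H), with H the
    # centering matrix I - (1/n) 1 1^T. Larger values indicate
    # stronger dependence between x and y; 0 means (empirical)
    # independence under the kernels used.
    n = x.shape[0]
    K = rbf_gram(x, sigma)
    L = rbf_gram(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2
```

In the single-index setting, x would be replaced by the projected covariates X @ beta, and the direction beta is chosen to maximize this dependence measure; the sparse high-dimensional version adds a penalty on beta and is optimized with the MM/linearized-ADMM scheme described in the abstract.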