$\ell_1$-Regularized Generalized Least Squares

Published 17 May 2024 in stat.ME, math.ST, stat.ML, and stat.TH | (2405.10719v1)

Abstract: In this paper we propose an $\ell_1$-regularized GLS estimator for high-dimensional regressions with potentially autocorrelated errors. We establish non-asymptotic oracle inequalities for estimation accuracy in a framework that allows for highly persistent autoregressive errors. Since the whitening matrix required to implement the GLS is unknown in practice, we present a feasible estimator for this matrix, derive consistency results, and ultimately show how our proposed feasible GLS can closely recover the optimal performance (as if the errors were white noise) of the LASSO. A simulation study verifies the performance of the proposed method, demonstrating that the penalized (feasible) GLS-LASSO estimator performs on par with the LASSO in the case of white noise errors, while outperforming it in terms of sign recovery and estimation error when the errors exhibit significant correlation.
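The feasible GLS-LASSO procedure described in the abstract can be illustrated with a minimal sketch: run a pilot LASSO on the raw data, estimate the error autocorrelation from the residuals, quasi-difference (whiten) the data, and re-run the LASSO. This is an illustrative simplification, not the paper's implementation: it assumes AR(1) errors, uses a plain coordinate-descent LASSO, and all function names (`lasso_cd`, `feasible_gls_lasso`) are hypothetical.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator used in the LASSO coordinate update."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate descent for (1/2n)||y - Xb||_2^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()                      # residual y - X @ b (b starts at zero)
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]       # partial residual excluding coordinate j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

def feasible_gls_lasso(X, y, lam):
    """Two-step GLS-LASSO sketch with an estimated AR(1) whitening step."""
    # Step 1: pilot LASSO on the raw (correlated-error) data.
    b_pilot = lasso_cd(X, y, lam)
    u = y - X @ b_pilot
    # Step 2: estimate the AR(1) coefficient of the residuals
    # (a stand-in for the paper's feasible whitening-matrix estimator).
    phi = (u[:-1] @ u[1:]) / (u[:-1] @ u[:-1])
    # Step 3: quasi-difference y and X to whiten the errors, re-run the LASSO.
    y_w = y[1:] - phi * y[:-1]
    X_w = X[1:] - phi * X[:-1]
    return lasso_cd(X_w, y_w, lam), phi
```

Under strong error autocorrelation the whitened second-stage LASSO typically recovers the AR coefficient and the sparse signal more accurately than the pilot fit, mirroring the simulation findings summarized above.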

