Beyond Low Rank: Fast Low-Rank + Diagonal Decomposition with a Spectral Approach
Abstract: Low-rank plus diagonal (LRPD) decompositions provide a powerful structural model for large covariance matrices, simultaneously capturing global shared factors and localized corrections that arise in covariance estimation, factor analysis, and large-scale kernel learning. We introduce an alternating low-rank-then-diagonal (Alt) algorithm that provably reduces approximation error and significantly outperforms gradient descent while remaining cheaper than majorization-minimization methods~\cite{sun2016majorization}. To scale to large matrices, we develop a randomized LRPD variant that combines fixed-rank Nyström sketching~\cite{tropp2017fixed} for the low-rank component with Diag++ stochastic diagonal estimation~\cite{baston2022stochastic}. This hybrid algorithm achieves machine-precision decomposition error using a number of matrix-vector products far smaller than the ambient dimension, and comes with rigorous non-asymptotic error bounds. On synthetic data, it exactly recovers LRPD-structured matrices with high efficiency, and on real-world S&P 500 stock return covariances, where the spectrum decays slowly and strong sector structure is present, it achieves substantially lower error than pure low-rank approximations.
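The alternating low-rank-then-diagonal idea can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes the natural update pair (a truncated eigendecomposition of $A - D$ for the low-rank step, then the residual diagonal for the diagonal step) and a fixed iteration count; the paper's exact update rules, stopping criteria, and the randomized Nyström/Diag++ variant are not shown here.

```python
import numpy as np

def alt_lrpd(A, rank, n_iter=50):
    """Hypothetical alternating low-rank-then-diagonal sketch: A ~ L + diag(d).

    Low-rank step: best rank-k symmetric approximation of A - diag(d)
    via a truncated eigendecomposition. Diagonal step: the Frobenius-
    optimal diagonal correction is the diagonal of the residual A - L.
    """
    n = A.shape[0]
    d = np.zeros(n)
    for _ in range(n_iter):
        # Truncated eigendecomposition of the symmetric residual A - diag(d).
        w, V = np.linalg.eigh(A - np.diag(d))
        top = np.argsort(w)[::-1][:rank]
        L = (V[:, top] * w[top]) @ V[:, top].T
        # Optimal diagonal given the current low-rank part.
        d = np.diag(A - L).copy()
    return L, d

# Usage: recover a synthetic LRPD-structured matrix A = U U^T + D.
rng = np.random.default_rng(0)
n, k = 50, 3
U = rng.standard_normal((n, k))
D_true = rng.uniform(0.5, 2.0, size=n)
A = U @ U.T + np.diag(D_true)

L, d = alt_lrpd(A, rank=k)
err = np.linalg.norm(A - L - np.diag(d)) / np.linalg.norm(A)
```

On exactly LRPD-structured input like this, the pair $(UU^\top, D)$ is a fixed point of the two updates, so the relative error drives toward zero; on real covariances the iteration instead settles at the best rank-$k$-plus-diagonal fit.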