Scaling Gaussian Processes for Learning Curve Prediction via Latent Kronecker Structure

Published 11 Oct 2024 in cs.LG and stat.ML (arXiv:2410.09239v1)

Abstract: A key task in AutoML is to model learning curves of machine learning models jointly as a function of model hyper-parameters and training progression. While Gaussian processes (GPs) are suitable for this task, naïve GPs require $\mathcal{O}(n^3 m^3)$ time and $\mathcal{O}(n^2 m^2)$ space for $n$ hyper-parameter configurations and $\mathcal{O}(m)$ learning curve observations per hyper-parameter. Efficient inference via Kronecker structure is typically incompatible with early-stopping due to missing learning curve values. We impose $\textit{latent Kronecker structure}$ to leverage efficient product kernels while handling missing values. In particular, we interpret the joint covariance matrix of observed values as the projection of a latent Kronecker product. Combined with iterative linear solvers and structured matrix-vector multiplication, our method only requires $\mathcal{O}(n^3 + m^3)$ time and $\mathcal{O}(n^2 + m^2)$ space. We show that our GP model can match the performance of a Transformer on a learning curve prediction task.
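The core trick described in the abstract — treating the observed covariance as a projection $P (A \otimes B) P^\top$ of a latent Kronecker product, and multiplying by it without ever forming the $nm \times nm$ matrix — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the kernel matrices `A` (over hyper-parameter configurations) and `B` (over training steps), the mask, and the sizes are all made-up placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 7  # n hyper-parameter configs, m learning-curve steps

# Hypothetical PSD kernel matrices over configs (A) and training steps (B).
A = rng.standard_normal((n, n)); A = A @ A.T + n * np.eye(n)
B = rng.standard_normal((m, m)); B = B @ B.T + m * np.eye(m)

# Observed entries; early stopping means some curve values are missing.
mask = rng.random(n * m) < 0.6

def latent_kron_mvm(v_obs):
    """Compute P (A kron B) P^T v without forming the nm x nm matrix."""
    v = np.zeros(n * m)
    v[mask] = v_obs              # P^T: scatter observations into the latent grid
    V = v.reshape(n, m)          # row-major: latent index = i*m + j
    W = A @ V @ B.T              # (A kron B) vec(V) identity, O(n^2 m + n m^2)
    return W.reshape(-1)[mask]   # P: gather back the observed entries

v_obs = rng.standard_normal(mask.sum())
fast = latent_kron_mvm(v_obs)

# Dense reference, built only to check the sketch; the real method never forms it.
K_obs = np.kron(A, B)[np.ix_(mask, mask)]
assert np.allclose(fast, K_obs @ v_obs)
```

A matrix-vector product of this form is exactly what iterative solvers such as conjugate gradients need, which is how the method reaches the quoted $\mathcal{O}(n^2 + m^2)$ space cost: only `A`, `B`, and a few length-$nm$ vectors are ever stored.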
