
Efficient hyperparameter tuning for kernel ridge regression with Bayesian optimization

Published 1 Apr 2020 in physics.chem-ph and physics.comp-ph | arXiv:2004.00675v1

Abstract: Machine learning methods usually depend on internal parameters -- so-called hyperparameters -- that need to be optimized for best performance. Such optimization poses a burden on machine learning practitioners, requiring expert knowledge, intuition, or computationally demanding brute-force parameter searches. Here we address the need for more efficient, automated hyperparameter selection with Bayesian optimization. We apply this technique to the kernel ridge regression machine learning method for two different descriptors of the atomic structure of organic molecules, one of which introduces its own set of hyperparameters to the method. We identify optimal hyperparameter configurations and infer entire prediction error landscapes in hyperparameter space, which serve as visual guides to the hyperparameter dependence. We further demonstrate that, as the number of hyperparameters grows, Bayesian optimization becomes significantly more efficient in computational time than an exhaustive grid search -- the current default hyperparameter search method -- while delivering equivalent or even better accuracy.
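The workflow the abstract describes -- replacing an exhaustive grid search over kernel ridge regression (KRR) hyperparameters with a Bayesian-optimization loop driven by a Gaussian-process surrogate -- can be sketched in plain NumPy. This is an illustrative toy, not the paper's implementation: it uses 1-D synthetic data in place of molecular descriptors, a lower-confidence-bound acquisition function, and hypothetical function names of my own.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression task standing in for a molecular dataset.
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=120)
X_train, y_train, X_val, y_val = X[:80], y[:80], X[80:], y[80:]

def rbf(A, B, gamma):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_val_error(log_alpha, log_gamma):
    """Fit KRR in closed form and return validation MAE."""
    alpha, gamma = 10.0 ** log_alpha, 10.0 ** log_gamma
    K = rbf(X_train, X_train, gamma)
    coef = np.linalg.solve(K + alpha * np.eye(len(K)), y_train)
    pred = rbf(X_val, X_train, gamma) @ coef
    return np.abs(pred - y_val).mean()

# Candidate grid in log space; an exhaustive grid search would
# evaluate all 225 points, BO will evaluate only 25 of them.
la, lg = np.meshgrid(np.linspace(-6, 1, 15), np.linspace(-3, 2, 15))
cands = np.column_stack([la.ravel(), lg.ravel()])

def gp_posterior(Xobs, yobs, Xq, scale=2.0, noise=1e-6):
    """GP posterior mean/std at query points Xq given observations."""
    k = lambda A, B: np.exp(
        -((A[:, None, :] - B[None, :, :]) ** 2).sum(-1) / (2 * scale ** 2))
    Koo = k(Xobs, Xobs) + noise * np.eye(len(Xobs))
    Koq = k(Xobs, Xq)
    sol = np.linalg.solve(Koo, Koq)
    mu = sol.T @ yobs
    var = 1.0 - (Koq * sol).sum(0)          # prior variance k(x, x) = 1
    return mu, np.sqrt(np.clip(var, 1e-12, None))

# Bayesian-optimization loop: random initial design, then pick the
# candidate minimizing a lower confidence bound on the surrogate.
idx = list(rng.choice(len(cands), 5, replace=False))
errs = [krr_val_error(*cands[i]) for i in idx]
for _ in range(20):
    ymean = np.mean(errs)
    mu, sd = gp_posterior(cands[idx], np.array(errs) - ymean, cands)
    acq = (mu + ymean) - 2.0 * sd           # LCB, minimized
    acq[idx] = np.inf                       # never re-evaluate a point
    i = int(np.argmin(acq))
    idx.append(i)
    errs.append(krr_val_error(*cands[i]))

best = cands[idx[int(np.argmin(errs))]]
print(f"best log10(alpha, gamma) = {best}, val MAE = {min(errs):.3f}")
```

The surrogate is cheap to refit, so the cost per iteration is dominated by the single KRR fit, which is why BO scales better than grid search as hyperparameters (and descriptor-specific ones, as in the paper) are added.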
