
Data-driven Prior Learning for Bayesian Optimisation

Published 24 Nov 2023 in cs.LG and stat.ML (arXiv:2311.14653v2)

Abstract: Transfer learning for Bayesian optimisation has generally assumed a strong similarity between optimisation tasks, with at least a subset having similar optimal inputs. This assumption can reduce computational costs, but it is violated in a wide range of optimisation problems where transfer learning may nonetheless be useful. We replace this assumption with a weaker one, requiring only that the shape of the optimisation landscape be similar, and analyse the recent method Prior Learning for Bayesian Optimisation (PLeBO) in this setting. By learning priors for the hyperparameters of the Gaussian process surrogate model, we can better approximate the underlying function, especially with few function evaluations. We validate the learned priors and compare against a breadth of transfer learning approaches, using synthetic data and a recent air pollution optimisation problem as benchmarks. We show that PLeBO and prior transfer find good inputs in fewer evaluations.
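The core idea in the abstract can be illustrated with a minimal sketch: a Bayesian optimisation step in which the Gaussian process surrogate's lengthscale is fit by MAP estimation under a prior learned from related tasks, and the next input is chosen by expected improvement. This is not the authors' implementation; the function names and the log-normal form of the learned prior are illustrative assumptions.

```python
# Minimal sketch (assumed, not the paper's code): BO with a learned
# lengthscale prior on the GP surrogate. Prior parameters prior_mu and
# prior_sigma would come from fitting related tasks.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import norm

def rbf_kernel(X1, X2, lengthscale):
    # Squared-exponential kernel on 1-D inputs.
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_posterior(log_ls, X, y, prior_mu, prior_sigma, noise=1e-6):
    """GP negative log marginal likelihood plus a learned Gaussian prior
    on the log-lengthscale (i.e. a log-normal prior on the lengthscale)."""
    ls = np.exp(log_ls)
    K = rbf_kernel(X, X, ls) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    nll = 0.5 * y @ alpha + np.log(np.diag(L)).sum()
    nlp = 0.5 * ((log_ls - prior_mu) / prior_sigma) ** 2
    return nll + nlp

def prior_informed_bo_step(X, y, candidates, prior_mu=0.0, prior_sigma=0.5):
    """Fit the lengthscale by MAP under the learned prior, then return the
    candidate maximising expected improvement (minimisation convention)."""
    res = minimize_scalar(neg_log_posterior, bounds=(-4.0, 4.0),
                          method="bounded",
                          args=(X, y, prior_mu, prior_sigma))
    ls = np.exp(res.x)
    K = rbf_kernel(X, X, ls) + 1e-6 * np.eye(len(X))
    Ks = rbf_kernel(candidates, X, ls)
    mu = Ks @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    sd = np.sqrt(np.maximum(var, 1e-12))
    best = y.min()
    z = (best - mu) / sd
    ei = (best - mu) * norm.cdf(z) + sd * norm.pdf(z)
    return candidates[np.argmax(ei)]
```

With few observations the marginal likelihood barely constrains the lengthscale, so the learned prior dominates the MAP fit; this matches the abstract's claim that learned priors help most when function evaluations are scarce.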

