
Fusion of Gaussian Processes Predictions with Monte Carlo Sampling

Published 3 Mar 2024 in cs.LG and stat.ML (arXiv:2403.01389v1)

Abstract: In science and engineering, we often work with models designed for accurate prediction of variables of interest. Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes. In this paper, we operate within the Bayesian paradigm, relying on Gaussian processes as our models. These models generate predictive probability density functions (pdfs), and the objective is to integrate them systematically, employing both linear and log-linear pooling. We introduce novel approaches for log-linear pooling, determining input-dependent weights for the predictive pdfs of the Gaussian processes. The aggregation of the pdfs is realized through Monte Carlo sampling, drawing samples of weights from their posterior. The performance of these methods, as well as those based on linear pooling, is demonstrated using a synthetic dataset.
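As a concrete illustration of the two pooling rules named in the abstract, the sketch below combines the Gaussian predictive pdfs of two hypothetical GP models at a single test input. The means, variances, and weights here are illustrative assumptions, and the weights are held fixed; the paper's methods instead make the weights input-dependent and draw them from their posterior via Monte Carlo sampling.

```python
import numpy as np

# Two hypothetical GP models each emit a Gaussian predictive pdf
# N(mu_i, s2_i) at the test input; w are illustrative pooling weights.
mu = np.array([1.0, 2.0])   # predictive means
s2 = np.array([0.5, 1.0])   # predictive variances
w = np.array([0.6, 0.4])    # pooling weights, summing to 1

x = np.linspace(-3.0, 6.0, 1000)
dx = x[1] - x[0]

def gauss(x, m, v):
    return np.exp(-0.5 * (x - m) ** 2 / v) / np.sqrt(2.0 * np.pi * v)

# Linear pool: a convex mixture of the predictive pdfs.
linear = sum(wi * gauss(x, mi, vi) for wi, mi, vi in zip(w, mu, s2))

# Log-linear pool: weighted geometric mean of the pdfs, renormalized.
loglinear = np.exp(sum(wi * np.log(gauss(x, mi, vi))
                       for wi, mi, vi in zip(w, mu, s2)))
loglinear /= loglinear.sum() * dx

# For Gaussian components the log-linear pool is again Gaussian, with
# precision sum_i w_i / s2_i and a precision-weighted mean.
prec = np.sum(w / s2)
mean = np.sum(w * mu / s2) / prec
```

Note the qualitative difference: the linear pool is a mixture and can be multimodal, while the log-linear pool of Gaussians stays unimodal and concentrates where the component pdfs agree.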
