Variational Bayes Made Easy

Published 27 Apr 2023 in cs.LG, cs.AI, and stat.ML (arXiv:2304.14251v2)

Abstract: Variational Bayes is a popular method for approximate inference, but its derivation can be cumbersome. To simplify the process, we give a 3-step recipe to identify the posterior form by explicitly looking for linearity with respect to expectations of well-known distributions. We can then write the update directly by simply "reading off" the terms in front of those expectations. The recipe makes the derivation easier, faster, shorter, and more general.
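The page does not reproduce the recipe itself, but the "read-off" idea described in the abstract can be illustrated on the simplest conjugate model. The sketch below uses standard textbook results (Gaussian likelihood with a Gaussian prior on the mean) and hypothetical numbers, not the paper's own notation: the log-joint is linear in the sufficient statistics (θ, θ²), so the posterior is Gaussian and its parameters can be read off the coefficients.

```python
import numpy as np

# Model (illustrative, not the paper's example):
#   y_i ~ N(theta, sigma^2),  prior theta ~ N(mu0, tau0^2)
rng = np.random.default_rng(0)
sigma2, mu0, tau02 = 1.0, 0.0, 4.0
y = rng.normal(1.5, np.sqrt(sigma2), size=50)
n = y.size

# Log-joint as a function of theta is linear in (theta, theta^2):
#   coefficient of theta^2: -(n / (2 sigma^2) + 1 / (2 tau0^2))
#   coefficient of theta:    sum(y) / sigma^2 + mu0 / tau0^2
coef_theta2 = -(n / (2 * sigma2) + 1 / (2 * tau02))
coef_theta = y.sum() / sigma2 + mu0 / tau02

# "Read off" the Gaussian posterior from those coefficients:
post_prec = -2 * coef_theta2          # posterior precision
post_mean = coef_theta / post_prec    # posterior mean

# Sanity check against the standard conjugate closed form.
exact_prec = n / sigma2 + 1 / tau02
exact_mean = (y.sum() / sigma2 + mu0 / tau02) / exact_prec
assert np.isclose(post_prec, exact_prec)
assert np.isclose(post_mean, exact_mean)
```

Because the coefficients match a known family's sufficient statistics term by term, no completing-the-square derivation is needed; the same pattern-matching is what the paper's recipe generalizes.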
