Combining Normalizing Flows and Quasi-Monte Carlo

Published 11 Jan 2024 in stat.CO and stat.ML | (2401.05934v1)

Abstract: Recent advances in machine learning have led to the development of new methods for enhancing Monte Carlo methods such as Markov chain Monte Carlo (MCMC) and importance sampling (IS). One such method is normalizing flows, which use a neural network to approximate a distribution by evaluating it pointwise. Normalizing flows have been shown to improve the performance of MCMC and IS. On the other hand, (randomized) quasi-Monte Carlo methods are used to perform numerical integration. They replace the random sampling of Monte Carlo with a sequence that covers the hypercube more uniformly, resulting in better convergence rates for the error than plain Monte Carlo. In this work, we combine these two methods by using quasi-Monte Carlo to sample the initial distribution that is transported by the flow. We demonstrate through numerical experiments that this combination can lead to an estimator with significantly lower variance than if the flow were sampled with classic Monte Carlo.
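The core idea of the paper, replacing the random base samples of a flow with randomized quasi-Monte Carlo points, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the affine map `flow` stands in for a trained normalizing flow (here it transports the standard normal base distribution exactly onto the target, so the transport is perfect by construction), while the scrambled Sobol' generator is SciPy's real `scipy.stats.qmc.Sobol`.

```python
import numpy as np
from scipy.stats import qmc, norm

d, n = 2, 2**10  # dimension and sample size (a power of two for Sobol')

# Target: N(mu, sigma^2 I). The toy "flow" T(z) = mu + sigma*z transports
# the base distribution N(0, I) onto it exactly (illustration only; in the
# paper the flow is a trained neural network).
mu, sigma = 3.0, 0.5

def flow(z):
    return mu + sigma * z

def integrand(x):
    return x.sum(axis=1)  # estimate E[X_1 + X_2] = 2*mu

# Plain Monte Carlo: i.i.d. standard normal base samples.
rng = np.random.default_rng(0)
z_mc = rng.standard_normal((n, d))

# Randomized QMC: scrambled Sobol' points in (0,1)^d, mapped to the
# base distribution N(0, I) through the inverse Gaussian CDF.
u = qmc.Sobol(d, scramble=True, seed=0).random(n)
z_qmc = norm.ppf(u)

# Both estimators push base samples through the same flow; only the
# sampling of the base distribution differs.
est_mc = integrand(flow(z_mc)).mean()
est_qmc = integrand(flow(z_qmc)).mean()
```

Because the low-discrepancy points cover the hypercube more uniformly than i.i.d. draws, `est_qmc` typically sits much closer to the true value 2*mu = 6.0 than `est_mc`, which is the variance reduction the paper studies.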
