Infinite Neural Network Quantum States: Entanglement and Training Dynamics

Published 1 Dec 2021 in quant-ph, cond-mat.dis-nn, cs.LG, and hep-th | arXiv:2112.00723v2

Abstract: We study infinite limits of neural network quantum states ($\infty$-NNQS), which exhibit representation power through ensemble statistics, and also tractable gradient descent dynamics. Ensemble averages of Rényi entropies are expressed in terms of neural network correlators, and architectures that exhibit volume-law entanglement are presented. A general framework is developed for studying the gradient descent dynamics of neural network quantum states (NNQS), using a quantum state neural tangent kernel (QS-NTK). For $\infty$-NNQS the training dynamics is simplified, since the QS-NTK becomes deterministic and constant. An analytic solution is derived for quantum state supervised learning, which allows an $\infty$-NNQS to recover any target wavefunction. Numerical experiments on finite and infinite NNQS in the transverse field Ising model and Fermi-Hubbard model demonstrate excellent agreement with theory. $\infty$-NNQS opens up new opportunities for studying entanglement and training dynamics in other physics applications, such as in finding ground states.
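The abstract's two central NTK claims can be illustrated numerically. The following is a minimal sketch, not code from the paper: it computes the empirical neural tangent kernel of a small one-hidden-layer network at initialization, then uses the linearized (constant-kernel) gradient-flow solution, under which training residuals decay as $r(t) = e^{-\eta K t} r(0)$, so a positive-definite kernel drives the outputs toward any target. The architecture, width, learning rate, and inputs are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's code): empirical NTK for
# f(x) = (1/sqrt(n)) * v . tanh(W x), a one-hidden-layer network.
# In the infinite-width limit the kernel below becomes deterministic
# and constant during training; here we compute it at finite width.

rng = np.random.default_rng(0)
n, d = 512, 3                      # hidden width, input dimension (assumed)
W = rng.normal(size=(n, d))        # input weights at initialization
v = rng.normal(size=n)             # output weights at initialization

def jacobian(x):
    """Gradient of f(x) with respect to all parameters (W, v), flattened."""
    pre = W @ x
    h = np.tanh(pre)
    dv = h / np.sqrt(n)                                     # df/dv
    dW = ((v * (1.0 - h**2))[:, None] * x[None, :]) / np.sqrt(n)  # df/dW
    return np.concatenate([dW.ravel(), dv])

xs = [rng.normal(size=d) for _ in range(4)]        # 4 training inputs
J = np.stack([jacobian(x) for x in xs])            # (inputs, params)
K = J @ J.T                                        # empirical NTK Gram matrix

# Linearized supervised-learning dynamics under a constant kernel:
# residuals on the training set obey r(t) = exp(-eta * K * t) r(0),
# so the network recovers the target as t -> infinity when K > 0.
f0 = np.array([v @ np.tanh(W @ x) / np.sqrt(n) for x in xs])
target = np.ones(4)                                # arbitrary target values
eta, t = 0.1, 200.0
w, U = np.linalg.eigh(K)                           # K is symmetric PSD
rt = U @ (np.exp(-eta * w * t) * (U.T @ (f0 - target)))

print("initial residual norm:", np.linalg.norm(f0 - target))
print("residual norm at t=200:", np.linalg.norm(rt))
```

Computing the matrix exponential through the eigendecomposition of `K` is exact here because the Gram matrix is symmetric; the residual norm can only shrink since every eigenvalue of `K` is non-negative.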
