
Evaluating the Convergence Limit of Quantum Neural Tangent Kernel

Published 5 Dec 2023 in quant-ph (arXiv:2312.02451v1)

Abstract: Quantum variational algorithms are among the major applications of quantum computing on current quantum devices. There have been recent attempts to establish a theoretical foundation for these algorithms. One possible approach is to characterize their training dynamics with the quantum neural tangent kernel. In this work, we construct the kernel for two models, the Quantum Ensemble and the Quantum Neural Network, and show the convergence of these models in the limit of infinitely many qubits. We also show applications of the kernel limit to regression tasks.
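The abstract's closing point, using a (neural) tangent kernel for regression, can be illustrated with a purely classical sketch. For a model that is linear in its parameters, f(x; θ) = θ·φ(x), the tangent kernel K(x, x') = ∇θf(x)·∇θf(x') equals φ(x)·φ(x') and is independent of θ, which is the hallmark of the kernel (lazy-training) regime the paper studies. The feature map `phi` and the target function below are illustrative choices, not the paper's quantum models:

```python
import numpy as np

# Hypothetical feature map standing in for the parameter gradient of a
# linear-in-parameters model f(x; theta) = theta . phi(x).
def phi(x):
    return np.array([1.0, x, x**2])

# Tangent kernel: inner product of parameter gradients. For this model it
# is constant in theta, so training reduces to kernel regression.
def ntk(x1, x2):
    return phi(x1) @ phi(x2)

# Toy regression task with a target in the kernel's function space.
X_train = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
y_train = 1.0 + 2.0 * X_train + 3.0 * X_train**2

# Kernel ridge regression with the tangent kernel (small ridge for stability).
ridge = 1e-8
K = np.array([[ntk(a, b) for b in X_train] for a in X_train])
alpha = np.linalg.solve(K + ridge * np.eye(len(X_train)), y_train)

def predict(x):
    return np.array([ntk(x, b) for b in X_train]) @ alpha

print(predict(0.25))  # close to 1 + 2*0.25 + 3*0.25**2 = 1.6875
```

Because the target lies in the span of the features, the kernel predictor recovers it almost exactly; the paper's contribution is showing that the quantum models' tangent kernels converge to such a fixed kernel as the number of qubits grows.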



Authors (1)
