
Equivariant Transformer is all you need

Published 20 Oct 2023 in hep-lat, cond-mat.dis-nn, and cs.LG | (2310.13222v1)

Abstract: Machine learning, in particular deep learning, has been accelerating computational physics and has been used to simulate systems on a lattice. Equivariance is essential when simulating a physical system because it imposes a strong inductive bias on the probability distribution described by a machine learning model, reducing the risk of erroneous extrapolation that deviates from the symmetries of the data and from physical laws. However, imposing symmetry on the model sometimes leads to a poor acceptance rate in self-learning Monte Carlo (SLMC). On the other hand, the attention mechanism used in Transformers such as GPT realizes a large model capacity. We introduce symmetry-equivariant attention to SLMC. To evaluate the architecture, we apply it to a spin-fermion model on a two-dimensional lattice. We find that it overcomes the poor acceptance rates of linear models, and we observe a scaling law for the acceptance rate, as in large language models built on Transformers.
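The SLMC procedure the abstract refers to proposes configurations with a cheap, trainable effective action and then corrects for the model's error with an exact accept/reject step; the acceptance rate therefore measures how well the effective model tracks the true action. A minimal sketch of that accept/reject logic, with hypothetical quadratic actions standing in for the true spin-fermion action and for the paper's equivariant-Transformer effective model:

```python
import numpy as np

rng = np.random.default_rng(0)

def s_true(x):
    # Placeholder for the expensive "true" action (hypothetical quadratic form).
    return 0.5 * np.sum(x**2)

def s_eff(x, theta=0.9):
    # Placeholder effective action; in the paper this role is played by a
    # symmetry-equivariant Transformer trained to approximate s_true.
    return 0.5 * theta * np.sum(x**2)

def slmc_step(x, n_inner=10, step=0.5):
    # Inner Metropolis updates sample from exp(-S_eff), which is cheap.
    y = x.copy()
    for _ in range(n_inner):
        prop = y + step * rng.standard_normal(y.shape)
        if rng.random() < np.exp(s_eff(y) - s_eff(prop)):
            y = prop
    # Outer accept/reject corrects the model error so the chain samples
    # exp(-S):  A = min(1, exp(-[S(y)-S(x)] + [S_eff(y)-S_eff(x)])).
    log_a = -(s_true(y) - s_true(x)) + (s_eff(y) - s_eff(x))
    return (y, True) if np.log(rng.random()) < log_a else (x, False)

x = rng.standard_normal(16)
n_accept = 0
for _ in range(200):
    x, accepted = slmc_step(x)
    n_accept += accepted
print(n_accept / 200)  # acceptance rate; high when S_eff tracks S closely
```

The closer `s_eff` is to `s_true`, the closer the outer acceptance rate gets to one, which is why the paper uses the acceptance rate as the figure of merit when comparing linear and Transformer-based effective models.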

Citations (3)

Authors (2)
