Attention-Based Foundation Model for Quantum States
Published 12 Dec 2025 in cond-mat.str-el | (2512.11962v1)
Abstract: We present an attention-based foundation model architecture for learning and predicting quantum states across Hamiltonian parameters, system sizes, and physical systems. Using only basis configurations and physical parameters as inputs, our trained neural network produces highly accurate ground-state wavefunctions. For example, we build the phase diagram of the 2D square-lattice $t$-$V$ model with $N$ particles from only 18 parameter points $(V/t, N)$. Thus, our architecture provides a basis for building a universal foundation model for quantum matter.
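
To make the input/output structure described in the abstract concrete, the following is a minimal sketch of an attention-based, parameter-conditioned neural quantum state in PyTorch: it takes a basis configuration together with physical parameters such as $(V/t, N)$ and returns a complex log-wavefunction value. The class name, layer sizes, and the choice of prepending the parameters as a conditioning token are illustrative assumptions, not the paper's actual architecture.

```python
import torch
import torch.nn as nn

class ParameterConditionedNQS(nn.Module):
    """Sketch of an attention-based, parameter-conditioned neural quantum state.

    Maps a basis configuration plus physical parameters (e.g. V/t, N) to a
    complex log-wavefunction value. All dimensions and the conditioning
    scheme are assumptions for illustration only.
    """

    def __init__(self, n_sites: int, d_model: int = 64, n_heads: int = 4,
                 n_layers: int = 2, n_params: int = 2):
        super().__init__()
        # Embed local occupation numbers (0/1) and add a learned site embedding.
        self.occ_embed = nn.Embedding(2, d_model)
        self.pos_embed = nn.Parameter(torch.zeros(n_sites, d_model))
        # Physical parameters (V/t, N) are projected and prepended as a conditioning token.
        self.param_embed = nn.Linear(n_params, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, n_layers)
        # Two outputs per configuration: log|psi| and phase.
        self.head = nn.Linear(d_model, 2)

    def forward(self, configs: torch.Tensor, params: torch.Tensor) -> torch.Tensor:
        # configs: (batch, n_sites) integer occupations; params: (batch, n_params) floats.
        x = self.occ_embed(configs) + self.pos_embed          # (batch, n_sites, d_model)
        cond = self.param_embed(params).unsqueeze(1)          # (batch, 1, d_model)
        x = torch.cat([cond, x], dim=1)                       # prepend conditioning token
        h = self.encoder(x)[:, 0]                             # read out the conditioning token
        log_amp, phase = self.head(h).unbind(-1)
        return torch.complex(log_amp, phase)                  # log psi(config; V/t, N)

# Example: evaluate log-amplitudes for a 4x4 lattice (16 sites).
model = ParameterConditionedNQS(n_sites=16)
configs = torch.randint(0, 2, (8, 16))                        # 8 sample configurations
params = torch.tensor([[2.0, 8.0]] * 8)                       # (V/t, N) per sample
log_psi = model(configs, params)                              # complex log-wavefunction values
```

Conditioning on $(V/t, N)$ through an extra input token is one simple way to let a single network represent ground states across a range of Hamiltonian parameters; the paper's specific conditioning and readout choices may differ.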