GPUTB-2: An efficient E(3) network method for learning high-precision orthogonal Hamiltonians
Abstract: Although equivariant neural networks have become a cornerstone for learning electronic Hamiltonians, the intrinsic non-orthogonality of linear combinations of atomic orbitals (LCAO) basis sets poses a fundamental challenge. The computational cost of Hamiltonian orthogonalization scales as O(N³), which severely hinders electronic structure calculations for large-scale systems containing hundreds of thousands to millions of atoms. To address this issue, we develop GPUTB-2, a framework that learns implicitly orthogonality-preserving Hamiltonians by training directly on electronic band structures. Benefiting from an E(3)-equivariant network accelerated by Gaunt tensor product and SO(2) tensor product layers, GPUTB-2 achieves significantly higher accuracy than GPUTB across multiple benchmark systems. Moreover, GPUTB-2 accurately predicts large-scale electronic structures, including transport properties of temperature-perturbed SnSe and the band structures of magic-angle twisted bilayer graphene. By further integrating this framework with the linear-scaling quantum transport (LSQT) method, we investigate the electronic properties of million-atom amorphous graphene and uncover pressure-induced electronic structure transitions in more complex amorphous silicon. Together, these results establish GPUTB-2 as a high-accuracy and scalable approach for predicting orthogonal Hamiltonians.
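To make the O(N³) bottleneck concrete, the sketch below illustrates one standard orthogonalization scheme, Löwdin symmetric orthogonalization, for a non-orthogonal LCAO Hamiltonian. This is an assumed example for illustration only; the abstract does not specify which orthogonalization procedure is meant, only that its cost scales cubically with system size. Given a Hamiltonian H and overlap matrix S, the transformed matrix S^{-1/2} H S^{-1/2} is an orthogonal-basis Hamiltonian with the same spectrum as the generalized eigenproblem H C = S C E; the eigendecomposition of S dominates the cost at O(N³).

```python
import numpy as np

def lowdin_orthogonalize(H, S):
    """Löwdin symmetric orthogonalization: H_orth = S^{-1/2} H S^{-1/2}.

    Assumed illustrative scheme (not necessarily the one used by GPUTB-2).
    The eigendecomposition of the overlap matrix S is O(N^3), which is
    the step that becomes prohibitive for million-atom systems.
    """
    w, U = np.linalg.eigh(S)                      # O(N^3) diagonalization
    S_inv_sqrt = U @ np.diag(1.0 / np.sqrt(w)) @ U.T
    return S_inv_sqrt @ H @ S_inv_sqrt

# Small random test case: symmetric H, symmetric positive-definite S.
rng = np.random.default_rng(0)
N = 6
A = rng.standard_normal((N, N))
H = (A + A.T) / 2
B = rng.standard_normal((N, N))
S = B @ B.T + N * np.eye(N)

H_orth = lowdin_orthogonalize(H, S)

# The spectrum of H_orth matches the generalized eigenvalues of (H, S),
# here computed as the eigenvalues of S^{-1} H.
E_gen = np.sort(np.linalg.eigvals(np.linalg.solve(S, H)).real)
E_orth = np.linalg.eigvalsh(H_orth)
```

A network that outputs an already-orthogonal Hamiltonian, as GPUTB-2 does by training directly on band structures, bypasses this transformation entirely.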