Transformer-Based Neural Networks Backflow for Strongly Correlated Electronic Structure

Published 30 Sep 2025 in quant-ph (arXiv:2509.25720v1)

Abstract: Solving the electronic Schrödinger equation for strongly correlated systems remains one of the grand challenges in quantum chemistry. Here we demonstrate that Transformer architectures can be adapted to capture the complex grammar of electronic correlations through neural network backflow. In this approach, electronic configurations are processed as token sequences: attention layers learn non-local orbital correlations, and token-specific neural networks map these contextual representations into backflowed orbitals. Application to strongly correlated iron-sulfur clusters validates the approach. For $\left[\mathrm{Fe}_2\mathrm{S}_2\left(\mathrm{SCH}_3\right)_4\right]^{2-}$ ([2Fe-2S]) (30e,20o), the ground-state energy agrees with DMRG to within chemical accuracy, while the predicted magnetic exchange coupling constants are closer to experimental values than those of all compared methods, including DMRG, CCSD(T), and recent neural-network approaches. For $\left[\mathrm{Fe}_4\mathrm{S}_4\left(\mathrm{SCH}_3\right)_4\right]^{2-}$ ([4Fe-4S]) (54e,36o), we match DMRG energies and accurately reproduce the detailed spin-spin correlation patterns between all Fe centers. The approach scales favorably to large active spaces inaccessible to exact methods, with distributed VMC optimization enabling stable convergence. These results establish Transformer-based backflow as a powerful variational ansatz for strongly correlated electronic structure, achieving superior magnetic property predictions while maintaining chemical accuracy in total energies.
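The pipeline described in the abstract (configuration → token sequence → attention → token-wise backflowed orbitals → determinant amplitude) can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: all dimensions, the random parameters, the single attention layer, and the shared (rather than token-specific) output map are simplifying assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_orb, n_elec, d = 8, 4, 16  # toy sizes: orbitals, electrons, embedding dim

# Hypothetical parameters: an embedding per occupation state (0/1),
# a positional embedding per orbital, attention weights, an output map,
# and a fixed reference orbital matrix to be "backflowed".
tok_emb = rng.normal(0, 0.1, (2, d))
pos_emb = rng.normal(0, 0.1, (n_orb, d))
Wq, Wk, Wv = (rng.normal(0, 0.1, (d, d)) for _ in range(3))
W_out = rng.normal(0, 0.1, (d, n_elec))
phi0 = rng.normal(0, 0.1, (n_orb, n_elec))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def backflow_amplitude(occ):
    """Amplitude <occ|psi>: determinant of configuration-dependent orbitals."""
    # 1) tokenize: one token per orbital, embedding chosen by its occupation
    h = tok_emb[occ] + pos_emb                      # (n_orb, d)
    # 2) one self-attention layer: each orbital token attends to all others,
    #    so its representation becomes aware of non-local correlations
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    att = softmax(q @ k.T / np.sqrt(d))
    h = h + att @ v                                 # residual connection
    # 3) token-wise map from context to corrections of the reference orbitals
    phi = phi0 + h @ W_out                          # (n_orb, n_elec)
    # 4) Slater determinant over the occupied rows
    rows = np.flatnonzero(occ)
    return np.linalg.det(phi[rows])

occ = np.zeros(n_orb, dtype=int)
occ[:n_elec] = 1                                    # a reference configuration
amp = backflow_amplitude(occ)
```

In a VMC calculation, amplitudes like `amp` would be sampled over configurations and the parameters optimized to minimize the energy; the key point illustrated here is that the orbital matrix entering the determinant depends on the whole configuration through attention, unlike a fixed Slater determinant.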
