
Subspace recursive Fermi-operator expansion strategies for large-scale DFT eigenvalue problems on HPC architectures

Published 11 Jan 2023 in physics.comp-ph and physics.chem-ph | (2301.04642v4)

Abstract: Quantum mechanical calculations for materials modelling using Kohn-Sham density functional theory (DFT) involve the solution of a nonlinear eigenvalue problem for the $N$ smallest eigenvalue-eigenvector pairs, with $N$ proportional to the number of electrons in the material system. These calculations are computationally demanding, with asymptotic cubic-scaling complexity in the number of electrons. Large-scale matrix eigenvalue problems arising from the discretization of the Kohn-Sham DFT equations in a systematically convergent basis traditionally rely on iterative orthogonal projection methods, which have been shown to be computationally efficient and scalable on massively parallel computing architectures. However, as the size of the material system increases, these methods incur dominant computational costs in the Rayleigh-Ritz projection step of the discretized Kohn-Sham Hamiltonian matrix and the subsequent subspace diagonalization of the projected matrix. This work explores polynomial expansion approaches based on recursive Fermi-operator expansion as an alternative to the subspace diagonalization of the projected Hamiltonian matrix, with the aim of reducing this computational cost. We then perform a detailed comparison of various recursive polynomial expansion approaches against the traditional approach of explicit diagonalization on both multi-node CPU and GPU architectures, and assess their relative performance in terms of accuracy, computational efficiency, scaling behaviour and energy efficiency.
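To illustrate the general idea of a recursive Fermi-operator expansion (not the paper's specific implementation or code), here is a minimal NumPy sketch of one well-known variant, the second-order spectral projection (SP2) scheme, which builds the zero-temperature density matrix from a Hamiltonian using only matrix-matrix multiplications instead of diagonalization. The Gershgorin spectral bounds, the tolerance, and the iteration cap are illustrative choices.

```python
import numpy as np

def sp2_density_matrix(H, n_occ, tol=1e-10, max_iter=100):
    """Recursive Fermi-operator expansion via second-order spectral
    projection (SP2): purifies a symmetric Hamiltonian H into the
    idempotent density matrix P with trace(P) = n_occ, using only
    matrix-matrix products (no explicit diagonalization)."""
    n = H.shape[0]
    # Cheap spectral bounds from Gershgorin circles.
    r = np.sum(np.abs(H), axis=1) - np.abs(np.diag(H))
    e_min = float(np.min(np.diag(H) - r))
    e_max = float(np.max(np.diag(H) + r))
    # Map the spectrum of H into [0, 1], reversed so that occupied
    # (low-energy) states sit near 1.
    X = (e_max * np.eye(n) - H) / (e_max - e_min)
    for _ in range(max_iter):
        X2 = X @ X
        tr_X, tr_X2 = np.trace(X), np.trace(X2)
        # At idempotency, trace(X) == trace(X^2): converged.
        if abs(tr_X - tr_X2) < tol:
            break
        # Pick the polynomial branch (X^2 pushes eigenvalues toward 0,
        # 2X - X^2 toward 1) that steers trace(X) toward n_occ.
        if abs(tr_X2 - n_occ) < abs(2.0 * tr_X - tr_X2 - n_occ):
            X = X2
        else:
            X = 2.0 * X - X2
    return X
```

Because each step needs only a matrix square, the cost is dominated by dense (or sparse) matrix multiplication, which maps well onto GPU and multi-node architectures; this is the motivation for using such recursions in place of subspace diagonalization.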
