
An efficient forgetting-aware fine-tuning framework for pretrained universal machine-learning interatomic potentials

Published 18 Jun 2025 in cond-mat.mtrl-sci (arXiv:2506.15223v1)

Abstract: Pretrained universal machine-learning interatomic potentials (MLIPs) have revolutionized computational materials science by enabling rapid atomistic simulations as efficient alternatives to ab initio methods. Fine-tuning pretrained MLIPs offers a practical approach to improving accuracy for materials and properties where predictive performance is insufficient. However, this approach often induces catastrophic forgetting, undermining the generalizability that is a key advantage of pretrained MLIPs. Herein, we propose reEWC, an advanced fine-tuning strategy that integrates Experience Replay and Elastic Weight Consolidation (EWC) to effectively balance forgetting prevention with fine-tuning efficiency. Using Li$_6$PS$_5$Cl (LPSC), a sulfide-based Li solid-state electrolyte, as a fine-tuning target, we show that reEWC significantly improves the accuracy of a pretrained MLIP, resolving well-known issues of potential energy surface softening and overestimated Li diffusivities. Moreover, reEWC preserves the generalizability of the pretrained MLIP and enables knowledge transfer to chemically distinct systems, including other sulfide, oxide, nitride, and halide electrolytes. Compared to Experience Replay and EWC used individually, reEWC delivers clear synergistic benefits, mitigating their respective limitations while maintaining computational efficiency. These results establish reEWC as a robust and effective solution for continual learning in MLIPs, enabling universal models that can advance materials research through large-scale, high-throughput simulations across diverse chemistries.
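The core idea in the abstract, combining Experience Replay (mixing pretraining data into fine-tuning batches) with an Elastic Weight Consolidation penalty (a Fisher-weighted quadratic pull toward the pretrained weights), can be illustrated with a toy sketch. This is not the paper's reEWC implementation: the one-parameter model, the penalty strength `lam_ewc`, the replay fraction, and the diagonal-Fisher estimate are all illustrative assumptions.

```python
import numpy as np

# Toy sketch: replay + EWC fine-tuning of a one-parameter model y = w * x.
rng = np.random.default_rng(0)

# "Pretraining" task (true w = 2.0) and "fine-tuning" task (true w = 3.0)
x_old = rng.uniform(1.0, 2.0, 200)
y_old = 2.0 * x_old + rng.normal(0.0, 0.1, x_old.size)
x_new = rng.uniform(1.0, 2.0, 50)
y_new = 3.0 * x_new + rng.normal(0.0, 0.1, x_new.size)

def grad_mse(w, x, y):
    """Gradient of mean squared error for y_hat = w * x."""
    return np.mean(2.0 * (w * x - y) * x)

w_star = 2.0  # pretrained weight (assumed converged on the old task)
# Empirical diagonal Fisher: mean squared per-sample gradient at w_star
fisher = np.mean((2.0 * (w_star * x_old - y_old) * x_old) ** 2)

lam_ewc = 50.0      # EWC penalty strength (hypothetical value)
replay_frac = 0.5   # fraction of each update drawn from old-task data

w, lr = w_star, 0.01
for _ in range(500):
    # Experience Replay: mix old-task and new-task minibatches each step
    i_new = rng.integers(0, x_new.size, 16)
    i_old = rng.integers(0, x_old.size, 16)
    g_new = grad_mse(w, x_new[i_new], y_new[i_new])
    g_old = grad_mse(w, x_old[i_old], y_old[i_old])
    # EWC: quadratic pull toward w_star, weighted by Fisher curvature
    g_ewc = lam_ewc * fisher * (w - w_star)
    w -= lr * ((1.0 - replay_frac) * g_new + replay_frac * g_old + g_ewc)

# w settles between the old optimum (2.0) and the new one (3.0),
# trading new-task accuracy against forgetting of the old task
```

With neither term, plain fine-tuning drives `w` to the new-task optimum and forgets the old task entirely; the replay gradient and the EWC pull each resist that drift, which is the synergy the abstract attributes to combining the two.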
