Geometry-enhanced Pre-training on Interatomic Potentials
Abstract: Machine learning interatomic potentials (MLIPs) enable molecular dynamics (MD) simulations with ab initio accuracy and have been applied across many fields of physical science. However, the performance and transferability of MLIPs are limited by insufficient labeled training data, since the labels require expensive ab initio calculations to obtain, especially for complex molecular systems. To address this challenge, we design a novel geometric structure learning paradigm that consists of two stages. First, we generate a large quantity of 3D configurations of the target molecular system with classical molecular dynamics simulations. Then, we propose geometry-enhanced self-supervised learning, combining masking, denoising, and contrastive learning, to better capture the topology and 3D geometric information of the unlabeled configurations. We evaluate our method on benchmarks ranging from small-molecule datasets to complex periodic molecular systems with a greater variety of elements. The experimental results show that the proposed pre-training method greatly enhances the accuracy of MLIPs at little extra computational cost and works well with different invariant or equivariant graph neural network architectures. Our method improves the generalization capability of MLIPs and helps realize accurate MD simulations for complex molecular systems.
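To make the denoising component of the self-supervised stage concrete, the sketch below shows one common formulation of coordinate denoising for pre-training on unlabeled 3D configurations: Gaussian noise is injected into atomic positions, and the pre-training target is the injected displacement itself. This is a minimal illustration under assumed conventions (the function names, the noise scale `sigma`, and the mean-squared-error objective are illustrative choices, not the paper's exact implementation):

```python
import numpy as np

def make_denoising_sample(positions, sigma=0.05, rng=None):
    """Corrupt atomic coordinates (n_atoms x 3) with Gaussian noise.

    Returns the noisy coordinates and the injected noise; a network
    pre-trained to predict the noise must learn local 3D geometry.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(scale=sigma, size=positions.shape)
    return positions + noise, noise

def denoising_loss(predicted_noise, true_noise):
    """Mean-squared error between predicted and injected displacements."""
    return float(np.mean((predicted_noise - true_noise) ** 2))

# Toy usage: 4 atoms in 3D from an (unlabeled) MD configuration.
rng = np.random.default_rng(0)
pos = rng.uniform(size=(4, 3))
noisy_pos, noise = make_denoising_sample(pos, sigma=0.05, rng=rng)
```

In an actual pre-training loop, `predicted_noise` would come from the invariant or equivariant GNN applied to `noisy_pos`, so no energy or force labels are needed at this stage.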
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kaminski, G.A., Friesner, R.A., Tirado-Rives, J., Jorgensen, W.L.: Evaluation and reparametrization of the opls-aa force field for proteins via comparison with accurate quantum chemical calculations on peptides. The Journal of Physical Chemistry B 105(28), 6474–6487 (2001) Car and Parrinello [1985] Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. 
ICLR (Poster) 2(3), 4 (2019)
Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning.
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? 
In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al.
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., in 't Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., E, W.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018) Perdew et al.
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Veličković et al. [2019] Veličković, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Physical Review Letters 120(14), 143001 (2018)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Yao, N., Chen, X., Fu, Z.-H., Zhang, Q.: Applying classical, Ab Initio , and machine-learning molecular dynamics simulations to the liquid electrolyte for rechargeable batteries 122(12), 10970–11021 https://doi.org/10.1021/acs.chemrev.1c00904 . Accessed 2023-01-17 Kaminski et al. [2001] Kaminski, G.A., Friesner, R.A., Tirado-Rives, J., Jorgensen, W.L.: Evaluation and reparametrization of the opls-aa force field for proteins via comparison with accurate quantum chemical calculations on peptides. The Journal of Physical Chemistry B 105(28), 6474–6487 (2001) Car and Parrinello [1985] Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. 
Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. 
In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. 
[2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kaminski, G.A., Friesner, R.A., Tirado-Rives, J., Jorgensen, W.L.: Evaluation and reparametrization of the opls-aa force field for proteins via comparison with accurate quantum chemical calculations on peptides. The Journal of Physical Chemistry B 105(28), 6474–6487 (2001) Car and Parrinello [1985] Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. 
[2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. 
[2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. 
[2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. 
[2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al.
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al.
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017) Fu et al.
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization.
arXiv preprint arXiv:1711.05101 (2017)
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. 
Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. 
[2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. 
[2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023)
Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Yao, N., Chen, X., Fu, Z.-H., Zhang, Q.: Applying classical, Ab Initio , and machine-learning molecular dynamics simulations to the liquid electrolyte for rechargeable batteries 122(12), 10970–11021 https://doi.org/10.1021/acs.chemrev.1c00904 . Accessed 2023-01-17 Kaminski et al. [2001] Kaminski, G.A., Friesner, R.A., Tirado-Rives, J., Jorgensen, W.L.: Evaluation and reparametrization of the opls-aa force field for proteins via comparison with accurate quantum chemical calculations on peptides. The Journal of Physical Chemistry B 105(28), 6474–6487 (2001) Car and Parrinello [1985] Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. 
arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. 
arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kaminski, G.A., Friesner, R.A., Tirado-Rives, J., Jorgensen, W.L.: Evaluation and reparametrization of the opls-aa force field for proteins via comparison with accurate quantum chemical calculations on peptides. The Journal of Physical Chemistry B 105(28), 6474–6487 (2001) Car and Parrinello [1985] Car, R., Parrinello, M.: Unified approach for molecular dynamics and density-functional theory. Physical review letters 55(22), 2471 (1985) [7] Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. 
[2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. 
[2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. 
arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. 
arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). 
PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. 
arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. 
[2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017)
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)
- Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020)
- Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., E, W.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. 
[2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. 
[2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al.
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 
12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: A flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
- Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)
- Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020)
- Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . 
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. 
arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. 
[2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. 
[2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) Liu et al. [2022] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry.
In: International Conference on Learning Representations (2022) Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017)
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Butler, K.T., Davies, D.W., Cartwright, H., Isayev, O., Walsh, A.: Machine learning for molecular and materials science 559(7715), 547–555 https://doi.org/10.1038/s41586-018-0337-2 . Accessed 2023-02-11 [8] Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. 
Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 
1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. 
Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. 
arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. 
ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. 
[2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax.
ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. 
[2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. 
[2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. 
In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. 
[2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature Communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated.
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Noé, F., Tkatchenko, A., Müller, K.-R., Clementi, C.: Machine learning for molecular simulation 71(1), 361–390 https://doi.org/10.1146/annurev-physchem-042018-052331 . Accessed 2023-01-17 [9] Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. 
In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. 
[2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Unke, O.T., Chmiela, S., Sauceda, H.E., Gastegger, M., Poltavsky, I., Schütt, K.T., Tkatchenko, A., Müller, K.-R.: Machine learning force fields 121(16), 10142–10186 https://doi.org/10.1021/acs.chemrev.0c01111 . Accessed 2023-01-17 Gilmer et al. [2017] Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. 
[2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. 
In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. 
In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)
Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020)
Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp.
9323–9332 (2021). PMLR
Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Physical Review Letters 120(14), 143001 (2018)
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. 
[2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. 
arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. 
arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). 
PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. 
arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. 
[2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. 
[2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. 
arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science Advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization.
arXiv preprint arXiv:1711.05101 (2017) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks.
arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al.
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials.
arXiv preprint arXiv:2303.02216 (2023)
Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 
12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al.
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Gilmer, J., Schoenholz, S.S., Riley, P.F., Vinyals, O., Dahl, G.E.: Neural message passing for quantum chemistry. In: International Conference on Machine Learning, pp. 1263–1272 (2017). PMLR Schütt et al. [2017] Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. 
ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: Schnet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in neural information processing systems 30 (2017) Gasteiger et al. [2020] Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. 
[2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. 
[2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. 
[2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. 
[2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al.
[2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. In: International Conference on Learning Representations (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al.
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
- Schütt, K., Kindermans, P.-J., Sauceda Felix, H.E., Chmiela, S., Tkatchenko, A., Müller, K.-R.: SchNet: A continuous-filter convolutional neural network for modeling quantum interactions. Advances in Neural Information Processing Systems 30 (2017)
- Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020)
- Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020) Thomas et al. [2018] Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. 
In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation-and translation-equivariant neural networks for 3d point clouds. arXiv preprint arXiv:1802.08219 (2018) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Batzner et al. [2022] Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al.
[2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al.
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) Liu et al. [2022] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022) Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. 
[2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. 
[2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 
12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Gasteiger, J., Groß, J., Günnemann, S.: Directional message passing for molecular graphs. arXiv preprint arXiv:2003.03123 (2020)
- Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
[2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
[2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 
12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) Liu et al. [2022] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022) Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al.
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Thomas, N., Smidt, T., Kearnes, S., Yang, L., Li, L., Kohlhoff, K., Riley, P.: Tensor field networks: Rotation- and translation-equivariant neural networks for 3D point clouds. arXiv preprint arXiv:1802.08219 (2018)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization.
arXiv preprint arXiv:1711.05101 (2017) Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E (3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature communications 13(1), 2453 (2022) Villar et al. [2021] Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. 
[2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. 
Advances in Neural Information Processing Systems 34, 28848–28863 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. 
ICLR (Poster) (2019)
Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3d molecular representation learning framework (2023)
Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes.
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations (2022)
Stärk et al.
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. 
arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. 
In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Batzner, S., Musaelian, A., Sun, L., Geiger, M., Mailoa, J.P., Kornbluth, M., Molinari, N., Smidt, T.E., Kozinsky, B.: E(3)-equivariant graph neural networks for data-efficient and accurate interatomic potentials. Nature Communications 13(1), 2453 (2022)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
[2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. 
arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. 
In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Villar, S., Hogg, D.W., Storey-Fisher, K., Yao, W., Blum-Smith, B.: Scalars are universal: Equivariant machine learning, structured like classical physics. Advances in Neural Information Processing Systems 34, 28848–28863 (2021)
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs.
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Satorras et al. [2021] Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. 
ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Satorras, V.G., Hoogeboom, E., Welling, M.: E (n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR Velickovic et al. [2019] Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 
594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019) Hassani and Khasahmadi [2020] Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. 
[2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS — a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. 
arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. 
In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. 
arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Satorras, V.G., Hoogeboom, E., Welling, M.: E(n) equivariant graph neural networks. In: International Conference on Machine Learning, pp. 9323–9332 (2021). PMLR
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. 
arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Velickovic, P., Fedus, W., Hamilton, W.L., Liò, P., Bengio, Y., Hjelm, R.D.: Deep graph infomax. ICLR (Poster) 2(3), 4 (2019)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: GCC: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020)
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. 
arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Hassani, K., Khasahmadi, A.H.: Contrastive multi-view representation learning on graphs. In: International Conference on Machine Learning, pp. 4116–4126 (2020). PMLR Qiu et al. [2020] Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. 
[2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. 
arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. 
Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017)
Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., in 't Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Qiu, J., Chen, Q., Dong, Y., Zhang, J., Yang, H., Ding, M., Wang, K., Tang, J.: Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1150–1160 (2020) Hu et al. [2019] Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. 
arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019) You et al. [2020] You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. 
Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction.
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners.
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
- Hu, W., Liu, B., Gomes, J., Zitnik, M., Liang, P., Pande, V., Leskovec, J.: Strategies for pre-training graph neural networks. arXiv preprint arXiv:1905.12265 (2019)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: Graphmae: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022) Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. 
arXiv preprint arXiv:2202.08391 (2022) Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. 
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al.
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- You, Y., Chen, T., Wang, Z., Shen, Y.: When does self-supervision help graph convolutional networks? In: International Conference on Machine Learning, pp. 10871–10880 (2020). PMLR
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Hou et al. [2022] Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: self-supervised masked graph autoencoders.
In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
Chen et al. [2022] Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
Kipf and Welling [2016] Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: a universal 3D molecular representation learning framework (2023)
Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations.
Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. 
[2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Hou, Z., Liu, X., Cen, Y., Dong, Y., Yang, H., Wang, C., Tang, J.: GraphMAE: Self-supervised masked graph autoencoders. In: Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, pp. 594–604 (2022)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. 
[2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) Liu et al. [2022] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022) Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al.
[2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Chen, H., Zhang, S., Xu, G.: Graph masked autoencoder. arXiv preprint arXiv:2202.08391 (2022)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: a universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. 
[2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Kipf, T.N., Welling, M.: Variational graph auto-encoders. arXiv preprint arXiv:1611.07308 (2016) Kingma and Welling [2013] Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Kingma, D.P., Welling, M.: Auto-encoding variational bayes. arXiv preprint arXiv:1312.6114 (2013) Zhou et al. [2023] Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. 
In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. 
[2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-mol: A universal 3d molecular representation learning framework (2023) Liu et al. [2022] Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. 
[2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. 
[2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), e1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated.
In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017)
- Kingma, D.P., Welling, M.: Auto-encoding variational Bayes. arXiv preprint arXiv:1312.6114 (2013)
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
- Zhou, G., Gao, Z., Ding, Q., Zheng, H., Xu, H., Wei, Z., Zhang, L., Ke, G.: Uni-Mol: A universal 3D molecular representation learning framework (2023)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with SE(3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Liu, S., Guo, H., Tang, J.: Molecular geometry pretraining with se (3)-invariant denoising distance matching. arXiv preprint arXiv:2206.13602 (2022) Sun et al. [2020] Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020) Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: Cagnn: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020) Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Zhu et al. [2020] Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
Jin et al. [2021] Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS — a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Sun, K., Lin, Z., Zhu, Z.: Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 34, pp. 5892–5899 (2020)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). 
PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. 
[2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. 
[2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
[38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. 
In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., in 't Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., E, W.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Zhu, Y., Xu, Y., Yu, F., Wu, S., Wang, L.: CAGNN: Cluster-aware graph neural networks for unsupervised graph representation learning. arXiv preprint arXiv:2009.01674 (2020)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in Neural Information Processing Systems 33, 5812–5823 (2020)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Jin, W., Derr, T., Wang, Y., Ma, Y., Liu, Z., Tang, J.: Node similarity preserving graph convolutional networks. In: Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pp. 148–156 (2021) Jin et al. [2020] Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. 
arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. 
[2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
[2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. 
[2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Jin, W., Derr, T., Liu, H., Wang, Y., Wang, S., Liu, Z., Tang, J.: Self-supervised learning on graphs: Deep insights and new direction. arXiv preprint arXiv:2006.10141 (2020) Peng et al. [2020] Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Peng, Z., Dong, Y., Luo, M., Wu, X.-M., Zheng, Q.: Self-supervised graph representation learning via global context prediction. arXiv preprint arXiv:2003.01604 (2020) You et al. [2020] You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. 
Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. 
[2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. 
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS – a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- You, Y., Chen, T., Sui, Y., Chen, T., Wang, Z., Shen, Y.: Graph contrastive learning with augmentations. Advances in neural information processing systems 33, 5812–5823 (2020) You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. 
[2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
You et al. [2021] You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR
Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022)
Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
Liu et al. [2022] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Fu et al.
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- You, Y., Chen, T., Shen, Y., Wang, Z.: Graph contrastive learning automated. In: International Conference on Machine Learning, pp. 12121–12132 (2021). PMLR Wang et al. [2022] Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. 
[2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 
4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. 
In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. 
Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Wang, Y., Wang, J., Cao, Z., Barati Farimani, A.: Molecular contrastive learning of representations via graph neural networks. Nature Machine Intelligence 4(3), 279–287 (2022) Li et al. [2022] Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: Geomgcl: Geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022) [38] Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. 
In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. 
In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. 
arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. 
[2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. 
[1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
- Li, S., Zhou, J., Xu, T., Dou, D., Xiong, H.: GeomGCL: geometric graph contrastive learning for molecular property prediction. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 36, pp. 4541–4549 (2022)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3D geometry. In: International Conference on Learning Representations (2022)
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. 
[2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. 
[2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. 
[2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. 
[2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992)
- Liu, S., Wang, H., Liu, W., Lasenby, J., Guo, H., Tang, J.: Pre-training molecular graph representation with 3d geometry. In: International Conference on Learning Representations Stärk et al. [2022] Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3d infomax improves gnns for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: Dpa-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022) Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023) Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders.
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Stärk, H., Beaini, D., Corso, G., Tossou, P., Dallago, C., Günnemann, S., Liò, P.: 3D Infomax improves GNNs for molecular property prediction. In: International Conference on Machine Learning, pp. 20479–20502 (2022). PMLR
Zhang et al. [2022] Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
Wang et al. [2023] Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
Chanussot et al. [2021] Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: An extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS, a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: A scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. 
arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Zhang, D., Bi, H., Dai, F.-Z., Jiang, W., Zhang, L., Wang, H.: DPA-1: Pretraining of attention-based deep potential model for molecular simulation. arXiv preprint arXiv:2208.08236 (2022)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. 
Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. 
Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. 
In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. 
[2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. 
Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Wang, Y., Xu, C., Li, Z., Farimani, A.B.: Denoise pre-training on non-equilibrium molecules for accurate and transferable neural potentials. arXiv preprint arXiv:2303.02216 (2023)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open Catalyst 2020 (OC20) dataset and community challenges. ACS Catalysis 11(10), 6059–6072 (2021)
- Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science Advances 3(5), 1603015 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS: a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Chanussot, L., Das, A., Goyal, S., Lavril, T., Shuaibi, M., Riviere, M., Tran, K., Heras-Domingo, J., Ho, C., Hu, W., et al.: Open catalyst 2020 (oc20) dataset and community challenges. Acs Catalysis 11(10), 6059–6072 (2021) Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Smith, J.S., Isayev, O., Roitberg, A.E.: Ani-1: an extensible neural network potential with dft accuracy at force field computational cost. Chemical science 8(4), 3192–3203 (2017) Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3d molecular graphs. 
In: International Conference on Learning Representations (ICLR) (2022) Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Gasteiger, J., Becker, F., Günnemann, S.: Gemnet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021) Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. 
Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science Advances 3(5), 1603015 (2017)
- Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific Data 1(1), 1–7 (2014)
- Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of Chemical Physics 79(2), 926–935 (1983)
- Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature Communications 11(1), 5223 (2020)
- Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical Review Letters 120(14), 143001 (2018)
- Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical Review Letters 77(18), 3865 (1996)
- Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical Review B 50(24), 17953 (1994)
- Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Smith et al. [2017] Smith, J.S., Isayev, O., Roitberg, A.E.: ANI-1: an extensible neural network potential with DFT accuracy at force field computational cost. Chemical Science 8(4), 3192–3203 (2017)
- Liu et al. [2022] Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger et al. [2021] Gasteiger, J., Becker, F., Günnemann, S.: GemNet: universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt et al. [2021] Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
- Rappé et al. [1992] Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: UFF, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American Chemical Society 114(25), 10024–10035 (1992)
- He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022)
- Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008)
- Liu, Y., Wang, L., Liu, M., Lin, Y., Zhang, X., Oztekin, B., Ji, S.: Spherical message passing for 3D molecular graphs. In: International Conference on Learning Representations (ICLR) (2022)
- Gasteiger, J., Becker, F., Günnemann, S.: GemNet: Universal directional graph neural networks for molecules. Advances in Neural Information Processing Systems 34, 6790–6802 (2021)
- Schütt, K., Unke, O., Gastegger, M.: Equivariant message passing for the prediction of tensorial properties and molecular spectra. In: International Conference on Machine Learning, pp. 9377–9388 (2021). PMLR
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. 
Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Rappé, A.K., Casewit, C.J., Colwell, K., Goddard III, W.A., Skiff, W.M.: Uff, a full periodic table force field for molecular mechanics and molecular dynamics simulations. Journal of the American chemical society 114(25), 10024–10035 (1992) He et al. [2022] He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. 
[2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. 
[2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. 
Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- He, K., Chen, X., Xie, S., Li, Y., Dollár, P., Girshick, R.: Masked autoencoders are scalable vision learners. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 16000–16009 (2022) Vincent et al. [2008] Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. 
[2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. 
[1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. 
The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. 
Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Vincent, P., Larochelle, H., Bengio, Y., Manzagol, P.-A.: Extracting and composing robust features with denoising autoencoders. In: Proceedings of the 25th International Conference on Machine Learning, pp. 1096–1103 (2008) Chmiela et al. [2017] Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. 
[1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. 
Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. 
Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. 
Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Chmiela, S., Tkatchenko, A., Sauceda, H.E., Poltavsky, I., Schütt, K.T., Müller, K.-R.: Machine learning of accurate energy-conserving molecular force fields. Science advances 3(5), 1603015 (2017) Fu et al. [2022] Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. 
arXiv preprint arXiv:1711.05101 (2017) Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022) Ramakrishnan et al. [2014] Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. 
Scientific data 1(1), 1–7 (2014) Thompson et al. [2022] Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: Lammps-a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022) Jorgensen et al. [1983] Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. 
[2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983) Bogojeski et al. [2020] Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. 
Nature communications 11(1), 5223 (2020) Zhang et al. [2018] Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018) Perdew et al. [1996] Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996) Blöchl [1994] Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994) Loshchilov and Hutter [2017] Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017) Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)
- Fu, X., Wu, Z., Wang, W., Xie, T., Keten, S., Gomez-Bombarelli, R., Jaakkola, T.: Forces are not enough: Benchmark and critical evaluation for machine learning force fields with molecular simulations. arXiv preprint arXiv:2210.07237 (2022)
- Ramakrishnan, R., Dral, P.O., Rupp, M., Von Lilienfeld, O.A.: Quantum chemistry structures and properties of 134 kilo molecules. Scientific data 1(1), 1–7 (2014)
- Thompson, A.P., Aktulga, H.M., Berger, R., Bolintineanu, D.S., Brown, W.M., Crozier, P.S., Veld, P.J., Kohlmeyer, A., Moore, S.G., Nguyen, T.D., et al.: LAMMPS - a flexible simulation tool for particle-based materials modeling at the atomic, meso, and continuum scales. Computer Physics Communications 271, 108171 (2022)
- Jorgensen, W.L., Chandrasekhar, J., Madura, J.D., Impey, R.W., Klein, M.L.: Comparison of simple potential functions for simulating liquid water. The Journal of chemical physics 79(2), 926–935 (1983)
- Bogojeski, M., Vogt-Maranto, L., Tuckerman, M.E., Müller, K.-R., Burke, K.: Quantum chemical accuracy from density functional approximations via machine learning. Nature communications 11(1), 5223 (2020)
- Zhang, L., Han, J., Wang, H., Car, R., Weinan, E.: Deep potential molecular dynamics: a scalable model with the accuracy of quantum mechanics. Physical review letters 120(14), 143001 (2018)
- Perdew, J.P., Burke, K., Ernzerhof, M.: Generalized gradient approximation made simple. Physical review letters 77(18), 3865 (1996)
- Blöchl, P.E.: Projector augmented-wave method. Physical review B 50(24), 17953 (1994)
- Loshchilov, I., Hutter, F.: Decoupled weight decay regularization. arXiv preprint arXiv:1711.05101 (2017)