Machine learning complete intersection Calabi-Yau 3-folds
Abstract: Gaussian process regression, kernel support vector regression, random forest, extreme gradient boosting, and generalized linear model algorithms are applied to data of complete intersection Calabi–Yau (CICY) threefolds. It is shown that Gaussian process regression is the most suitable for learning the Hodge number h^{2,1} in terms of h^{1,1}. For the validation set, this regression achieves a Pearson correlation coefficient R^2 = 0.9999999995 with a root mean square error RMSE = 0.0002895011; for the calibration set, the corresponding values are R^2 = 0.9999999994 and RMSE = 0.0002854348. The training error and the cross-validation error of this regression are 10^{-9} and 1.28 × 10^{-7}, respectively. Learning h^{1,1} in terms of h^{2,1} yields R^2 = 1.000000 and RMSE = 7.395731 × 10^{-5} on the validation set of the Gaussian process regression.
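The regression task described above, learning one Hodge number as a function of the other, can be sketched with a minimal Gaussian process regressor. The code below is a pure-Python illustration under an RBF kernel, not the paper's implementation; the sample points are hypothetical stand-ins, not values from the CICY list, and in practice a library such as scikit-learn's GaussianProcessRegressor would be used instead.

```python
import math

def rbf(x1, x2, length=1.0, var=1.0):
    """Squared-exponential (RBF) kernel between two scalar inputs."""
    return var * math.exp(-((x1 - x2) ** 2) / (2.0 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(xs, ys, x_star, noise=1e-8):
    """GP posterior mean at x_star: k_*^T (K + noise I)^{-1} y."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)          # alpha = (K + noise I)^{-1} y
    k_star = [rbf(x_star, a) for a in xs]
    return sum(k * a for k, a in zip(k_star, alpha))

def rmse(y_true, y_pred):
    """Root mean square error, as reported in the abstract."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

if __name__ == "__main__":
    # Hypothetical (h^{1,1}, h^{2,1}) training pairs for illustration only.
    xs = [1.0, 2.0, 3.0]
    ys = [10.0, 20.0, 15.0]
    print(gp_predict(xs, ys, 2.0))   # ~20: interpolates the training data up to jitter
    preds = [gp_predict(xs, ys, x) for x in xs]
    print(rmse(ys, preds))           # near zero on the training inputs
```

With the small jitter term, the posterior mean essentially interpolates the training points, which mirrors how a GP can drive the training RMSE down to the tiny values quoted in the abstract.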