Tensor network calculation of the logarithmic correction exponent in the XY model
Abstract: We study the logarithmic correction to the scaling of the first Lee-Yang (LY) zero in the classical $XY$ model on square lattices using tensor renormalization group methods. Comparing the higher-order tensor renormalization group (HOTRG) and the loop-optimized tensor network renormalization (LoopTNR), we find that the entanglement filtering in LoopTNR is crucial for achieving the high accuracy needed to characterize the logarithmic correction, while HOTRG still provides approximate bounds on the zero location with its two bond-merging algorithms, the higher-order singular value decomposition and the oblique projectors. Using LoopTNR data computed up to linear system size $L = 1024$ on $L \times L$ lattices, we estimate the logarithmic correction exponent $r = -0.0643(9)$ by extrapolating the finite-size effective exponent, consistent with the renormalization group prediction of $r = -1/16$.
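The abstract refers to extrapolating a finite-size effective exponent for the logarithmic correction. As a minimal illustration of that idea (not the paper's actual procedure), the sketch below assumes the first LY zero follows a scaling form $z(L) \sim A\,L^{-y}(\ln L)^r$ with the leading exponent $y$ known, and extracts an effective $r$ between successive sizes. All numbers here are synthetic and chosen only for demonstration; the value $y = 1.875$ is an assumption, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch: effective logarithmic-correction exponent from
# zero locations z(L), assuming z(L) = A * L^(-y) * (ln L)^r with y known.
# Synthetic data only; none of these values come from the paper.
y_assumed = 1.875        # assumed leading scaling exponent (illustrative)
r_true = -1.0 / 16.0     # exponent used to generate the test data
A = 1.0

Ls = np.array([64, 128, 256, 512, 1024], dtype=float)
z = A * Ls ** (-y_assumed) * np.log(Ls) ** r_true  # synthetic zero locations

# Divide out the leading power law, then take a discrete logarithmic
# derivative with respect to ln(ln L):
#   r_eff = d ln(z * L^y) / d ln(ln L)
g = z * Ls ** y_assumed
r_eff = np.diff(np.log(g)) / np.diff(np.log(np.log(Ls)))
print(r_eff)  # on noiseless data this recovers r_true = -0.0625 exactly
```

With real finite-size data, $r_{\mathrm{eff}}$ would drift with $L$ due to subleading corrections, and the final estimate would come from extrapolating it to $L \to \infty$.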