
Estimating the Reynolds number dependence of the chaotic attractor dimension in two-dimensional Kolmogorov flow

Published 27 May 2025 in physics.flu-dyn (arXiv:2505.21361v1)

Abstract: Deep autoencoder neural networks can generate highly accurate, low-order representations of turbulence. We design a new family of autoencoders which combine a 'dense-block' encoder-decoder structure (Page et al., J. Fluid Mech. 991, 2024), an 'implicit rank minimization' series of linear layers acting on the embeddings (Zeng et al., Mach. Learn. Sci. Tech. 5, 2024) and a full discrete+continuous symmetry reduction. These models are applied to two-dimensional turbulence in Kolmogorov flow over a range of Reynolds numbers $25 \leq Re \leq 400$, and used to estimate the dimension of the chaotic attractor, $d_{\mathcal A}(Re)$. We find that the dimension scales like $Re^{1/3}$ -- much weaker than known bounds on the global attractor, which grow like $Re^{4/3}$. In addition, two-dimensional maps of the latent space in our models reveal a rich structure not seen in previous studies, including multiple classes of high-dissipation events at lower $Re$ which guide bursting trajectories. We visualize the embeddings of large numbers of "turbulent" unstable periodic orbits, which the model indicates are distinct (in terms of features) from any flow snapshot in a large turbulent dataset, suggesting their dynamical irrelevance. This is in sharp contrast to their appearance in more traditional low-dimensional projections, in which they appear to lie within the turbulent attractor.
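The 'implicit rank minimization' idea referenced in the abstract inserts a chain of purely linear layers after the encoder; training such a chain by gradient descent implicitly biases the latent code toward low effective rank, which is then read off from the singular-value spectrum of the embeddings. The sketch below is a minimal, untrained NumPy illustration of that latent pathway and the rank estimate, not the paper's implementation: the layer sizes, the random weights, and the single linear encoder (standing in for the dense-block encoder) are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper): flattened flow snapshot -> latent code.
n_input, n_latent, n_chain = 256, 32, 4   # n_chain: length of the linear-layer chain

# A single linear encoder stands in for the paper's dense-block encoder.
W_enc = rng.standard_normal((n_latent, n_input)) / np.sqrt(n_input)
# Chain of square linear layers acting on the embedding (no activations between
# them); under gradient-descent training this chain biases the code to low rank.
chain = [rng.standard_normal((n_latent, n_latent)) / np.sqrt(n_latent)
         for _ in range(n_chain)]

def encode(x):
    """Map a batch of snapshots (columns of x) through encoder + linear chain."""
    z = W_enc @ x
    for W in chain:
        z = W @ z
    return z

def effective_rank(Z, tol=1e-6):
    """Count singular values of the embedding matrix above a relative tolerance."""
    s = np.linalg.svd(Z, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

X = rng.standard_normal((n_input, 500))   # 500 synthetic snapshots
Z = encode(X)
print(Z.shape, effective_rank(Z))
```

With these untrained random weights the embeddings remain essentially full rank; in the trained models it is the drop in this effective rank (as a function of $Re$) that provides the estimate of the attractor dimension.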

Authors (2)
