
Featurizing Koopman Mode Decomposition For Robust Forecasting

Published 14 Dec 2023 in math.DS, math-ph, math.MP, and stat.ML | (2312.09146v5)

Abstract: This article introduces an advanced Koopman mode decomposition (KMD) technique -- coined Featurized Koopman Mode Decomposition (FKMD) -- that uses delay embedding and a learned Mahalanobis distance to enhance analysis and prediction of high dimensional dynamical systems. The delay embedding expands the observation space to better capture underlying manifold structure, while the Mahalanobis distance adjusts observations based on the system's dynamics. This aids in featurizing KMD in cases where good features are not a priori known. We show that FKMD improves predictions for a high-dimensional linear oscillator, a high-dimensional Lorenz attractor that is partially observed, and a cell signaling problem from cancer research.
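The pipeline the abstract describes (delay embedding, a Mahalanobis rescaling of observations, then a featurized Koopman/DMD regression for forecasting) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the toy oscillator, the feature count, and the whitening-style choice of Mahalanobis matrix are all assumptions made here for the example, and the random Fourier featurization stands in for whatever learned features the method actually uses.

```python
import numpy as np

def delay_embed(x, delays):
    """Stack `delays` consecutive observations into each row.

    x: (T, d) array; returns (T - delays + 1, d * delays).
    """
    T, d = x.shape
    return np.hstack([x[i:T - delays + 1 + i] for i in range(delays)])

def mahalanobis_features(X, M, omegas, phases):
    """Random Fourier features applied to Mahalanobis-rescaled observations.

    M reweights coordinates before the random projection, so the kernel
    implicitly uses a Mahalanobis rather than Euclidean distance.
    """
    Z = X @ M.T                                   # rescale observations by M
    return np.sqrt(2.0 / omegas.shape[0]) * np.cos(Z @ omegas.T + phases)

rng = np.random.default_rng(0)

# Toy data: a damped oscillator observed through a single coordinate
t = np.linspace(0, 20, 400)
x = (np.exp(-0.05 * t) * np.sin(2 * t))[:, None]

delays = 8
X = delay_embed(x, delays)                        # delay-embedded observations
D = X.shape[1]

# Hypothetical stand-in for the learned Mahalanobis matrix: the inverse
# square root of the empirical covariance (a simple whitening choice)
C = np.cov(X.T) + 1e-6 * np.eye(D)
evals, evecs = np.linalg.eigh(C)
M = evecs @ np.diag(evals ** -0.5) @ evecs.T

n_feat = 200
omegas = rng.normal(size=(n_feat, D))
phases = rng.uniform(0, 2 * np.pi, size=n_feat)
Phi = mahalanobis_features(X, M, omegas, phases)

# EDMD-style one-step regression in feature space, then roll forward
A, *_ = np.linalg.lstsq(Phi[:-1], Phi[1:], rcond=None)  # feature dynamics
B, *_ = np.linalg.lstsq(Phi[:-1], X[1:], rcond=None)    # features -> state

phi = Phi[-1]
preds = []
for _ in range(50):
    phi = phi @ A                                 # advance in feature space
    preds.append(phi @ B)                         # decode back to state space
preds = np.array(preds)                           # (50, D) forecast
```

The delay embedding widens the observation space to expose manifold structure, while the Mahalanobis rescaling adapts the feature kernel to the data's geometry; the final least-squares step is the standard (extended) DMD regression on those features.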

