A Riemannian Framework for Learning Reduced-order Lagrangian Dynamics

Published 24 Oct 2024 in cs.LG (arXiv:2410.18868v3)

Abstract: By incorporating physical consistency as an inductive bias, deep neural networks display increased generalization capabilities and data efficiency when learning nonlinear dynamic models. However, the complexity of these models generally increases with the system dimensionality, requiring larger datasets, more complex deep networks, and significant computational effort. We propose a novel geometric network architecture to learn physically-consistent reduced-order dynamic parameters that accurately describe the original high-dimensional system behavior. This is achieved by building on recent advances in model-order reduction and by adopting a Riemannian perspective to jointly learn a nonlinear structure-preserving latent space and the associated low-dimensional dynamics. Our approach enables accurate long-term predictions of the high-dimensional dynamics of rigid and deformable systems with increased data efficiency by inferring interpretable and physically-plausible reduced Lagrangian models.
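To make the idea concrete, the sketch below illustrates one ingredient the abstract describes: evolving a low-dimensional latent state under a reduced Lagrangian with a learned symmetric positive-definite (SPD) mass matrix. This is a minimal hypothetical illustration, not the paper's architecture; the function names, the Cholesky parameterization of the SPD matrix, and the constant-mass, quadratic-potential assumptions are all simplifications chosen so the example stays self-contained.

```python
import numpy as np

def spd_mass(chol_params, d, eps=1e-4):
    """Build an SPD mass matrix M = L L^T + eps*I from unconstrained
    parameters, keeping M on the manifold of positive-definite matrices
    (a standard Riemannian parameterization)."""
    L = np.zeros((d, d))
    L[np.tril_indices(d)] = chol_params
    return L @ L.T + eps * np.eye(d)

def latent_accel(z, M, grad_V):
    """Euler-Lagrange acceleration for the reduced Lagrangian
    L(z, zdot) = 0.5 * zdot^T M zdot - V(z), with constant M:
    M zddot = -dV/dz."""
    return np.linalg.solve(M, -grad_V(z))

def rollout(z0, zdot0, M, grad_V, dt=0.01, steps=1000):
    """Semi-implicit (symplectic) Euler integration of the latent dynamics,
    which keeps the energy of conservative systems bounded over long rollouts."""
    z, zdot = z0.copy(), zdot0.copy()
    for _ in range(steps):
        zdot += dt * latent_accel(z, M, grad_V)
        z += dt * zdot
    return z, zdot

# Toy example: quadratic potential V(z) = 0.5 ||z||^2 gives harmonic motion.
d = 2
M = spd_mass(np.array([1.0, 0.0, 1.0]), d)  # approximately the identity
z, zdot = rollout(np.ones(d), np.zeros(d), M, grad_V=lambda z: z)
```

In the paper's setting, the latent state would come from a learned structure-preserving encoder and the mass matrix and potential would be neural networks; the point of the sketch is that once the reduced Lagrangian ingredients exist, long-term prediction reduces to integrating low-dimensional Euler-Lagrange equations.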

