Extracting Koopman Operators for Prediction and Control of Non-linear Dynamics Using Two-stage Learning and Oblique Projections
Abstract: The Koopman operator framework provides a perspective in which nonlinear dynamics are described through the lens of linear operators acting on function spaces. Because the framework naturally yields linear embedding models, there have been extensive efforts to exploit it for control, where linear controller designs can be applied to possibly nonlinear dynamics. However, it remains challenging to deploy this modeling procedure successfully across a wide range of applications. In this work, we address some of the fundamental limitations of linear embedding models. We derive a necessary condition for a linear embedding model to achieve zero modeling error, highlighting a trade-off between model expressivity and the structural restriction required to apply linear systems theory to nonlinear dynamics. To achieve good performance despite this trade-off, we propose a neural network-based model built on linear embedding with oblique projection, derived from a weak formulation of projection-based linear operator learning. We train the proposed model with a two-stage learning procedure: the features and operators are first initialized via orthogonal projection, followed by a main training stage in which the test functions characterizing the oblique projection are learned from data. The first stage attains the optimality guaranteed by the orthogonal projection, and the second stage improves generalizability across tasks by optimizing the model under the oblique projection. We demonstrate the effectiveness of the proposed method over other data-driven modeling methods through comprehensive numerical evaluations covering four tasks on three different systems.
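To make the projection-based operator learning in the abstract concrete, the following is a minimal NumPy sketch, not the paper's neural-network implementation. It assumes snapshot matrices of lifted features `Phi_x` (states) and `Phi_y` (successor states); `fit_koopman_orthogonal` is the orthogonal-projection (EDMD-style least-squares) fit used for initialization, while `fit_koopman_oblique` uses a separate matrix of test-function evaluations `Psi_x`, as in the weak formulation. The function names and the fixed (non-learned) test functions are illustrative assumptions.

```python
import numpy as np

def fit_koopman_orthogonal(Phi_x, Phi_y):
    """Stage 1: orthogonal projection (least squares).

    Finds K minimizing ||Phi_y - K Phi_x||_F, i.e. K = Phi_y Phi_x^+.
    """
    return Phi_y @ np.linalg.pinv(Phi_x)

def fit_koopman_oblique(Phi_x, Phi_y, Psi_x):
    """Stage 2: oblique projection defined by test functions psi.

    Enforces the weak formulation <psi, phi(y)> = K <psi, phi(x)>
    over the data, giving K = (Phi_y Psi_x^T)(Phi_x Psi_x^T)^+.
    With Psi_x = Phi_x this reduces to the orthogonal case.
    """
    A = Phi_y @ Psi_x.T
    G = Phi_x @ Psi_x.T
    return A @ np.linalg.pinv(G)

# Toy check on a linear system x_{k+1} = A x_k with identity features,
# where the exact Koopman matrix restricted to these features is A itself.
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
X = rng.standard_normal((2, 100))   # columns are state snapshots
Y = A_true @ X                      # successor snapshots
K_orth = fit_koopman_orthogonal(X, Y)
K_obl = fit_koopman_oblique(X, Y, X)  # psi = phi: recovers the orthogonal fit
```

In the paper's method the test functions (here the rows of `Psi_x`) are parameterized and learned in the second training stage, rather than fixed as in this sketch.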