Synthetic data generation for system identification: leveraging knowledge transfer from similar systems

Published 8 Mar 2024 in cs.LG, cs.AI, cs.SY, and eess.SY | (2403.05164v1)

Abstract: This paper addresses the challenge of overfitting in learning dynamical systems by introducing a novel approach for generating synthetic data, aimed at enhancing model generalization and robustness in data-scarce scenarios. Central to the proposed methodology is the concept of knowledge transfer from systems within the same class. Specifically, synthetic data are generated through a pre-trained meta-model that describes a broad class of systems to which the system of interest is assumed to belong. The training data serve a dual purpose: first, as input to the pre-trained meta-model to discern the system's dynamics, enabling prediction of its behavior and thereby the generation of synthetic output sequences for new input sequences; second, in conjunction with the synthetic data, to define the loss function used for model estimation. A validation dataset is used to tune a scalar hyper-parameter that balances the relative importance of training and synthetic data in the loss function. The same validation set can also be used for other purposes, such as early stopping during training, which is essential to avoid overfitting with small training datasets. The efficacy of the approach is demonstrated through a numerical example that highlights the advantages of integrating synthetic data into the system identification process.
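The combined loss described in the abstract, mixing the fit on real training data with the fit on meta-model-generated synthetic data, can be sketched as follows. This is a minimal illustration under stated assumptions: the function names, the mean-squared-error fit criterion, and the convex-combination form of the weighting with a scalar `gamma` are illustrative choices, not necessarily the paper's exact formulation.

```python
import numpy as np

def combined_loss(y_train, y_pred_train, y_synth, y_pred_synth, gamma):
    """Weighted combination of the model's fit on real and synthetic data.

    gamma in [0, 1] is the scalar hyper-parameter balancing the two terms;
    in the paper it is tuned on a separate validation dataset.
    """
    mse_train = np.mean((y_train - y_pred_train) ** 2)  # fit on measured outputs
    mse_synth = np.mean((y_synth - y_pred_synth) ** 2)  # fit on synthetic outputs
    return (1.0 - gamma) * mse_train + gamma * mse_synth

def select_gamma(candidate_gammas, fit_model, val_loss):
    """Grid search for gamma: fit a model per candidate, keep the one with
    the lowest validation loss. `fit_model` and `val_loss` are hypothetical
    callables standing in for the training and evaluation routines."""
    scores = [val_loss(fit_model(g)) for g in candidate_gammas]
    return candidate_gammas[int(np.argmin(scores))]
```

With `gamma = 0` the estimation reduces to standard fitting on the training data alone; with `gamma = 1` only the synthetic sequences drive the fit, so the validation-based tuning of `gamma` interpolates between the two regimes.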
