Preserving Conservation Laws in the Time-Evolving Natural Gradient Method via Relaxation and Projection Techniques
Abstract: Neural networks have demonstrated significant potential in solving partial differential equations (PDEs). While global approaches such as Physics-Informed Neural Networks (PINNs) offer promising capabilities, they often lack inherent temporal causality, which can limit their accuracy and stability for time-dependent problems. In contrast, local training frameworks that progressively update network parameters over time are naturally suited for evolving PDEs. However, a critical challenge remains: many physical systems possess intrinsic invariants -- such as energy or mass -- that must be preserved to ensure physically meaningful solutions. This paper addresses this challenge by enhancing the Time-Evolving Natural Gradient (TENG) method, a recently proposed local training framework. We introduce two complementary techniques: (i) a relaxation algorithm that ensures the target solution $u_{\text{target}}$ preserves both quadratic and general nonlinear invariants of the original system, providing a structure-preserving learning target; and (ii) a projection technique that maps the updated network parameters $\theta(t)$ back onto the invariant manifold, ensuring the final neural network solution strictly adheres to the conservation laws. Numerical experiments on the inviscid Burgers equation, Korteweg-de Vries equation, and acoustic wave equation demonstrate that our proposed approach significantly improves conservation properties while maintaining high accuracy.
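The projection idea in (ii) can be illustrated generically: given parameters that have drifted off the level set of a conserved quantity, a Newton-type correction moves them back along the constraint gradient. The sketch below is not the paper's algorithm but a minimal, hedged illustration of such a manifold projection; the invariant `I`, its gradient `grad_I`, and the toy quadratic invariant are all assumptions for demonstration.

```python
import numpy as np

def project_onto_invariant(theta, I, grad_I, I0, tol=1e-12, max_iter=50):
    """Newton-type projection of parameters theta onto the level set I(theta) = I0.

    Each step moves along grad_I (the normal direction of the level set),
    theta <- theta - lam * grad_I(theta), with lam chosen from a first-order
    expansion of the constraint residual.
    """
    theta = np.asarray(theta, dtype=float).copy()
    for _ in range(max_iter):
        r = I(theta) - I0          # constraint violation
        if abs(r) < tol:
            break
        g = grad_I(theta)
        theta -= g * r / (g @ g)   # first-order Lagrange/Newton update
    return theta

# Toy quadratic invariant I(theta) = ||theta||^2 (stand-in for a discrete
# L2 energy or mass); the exact projection here is a rescaling, so Newton
# converges in a few iterations.
I = lambda th: th @ th
grad_I = lambda th: 2.0 * th

theta = np.array([1.1, 0.9, 1.05])  # parameters that have drifted off I = 3
theta_proj = project_onto_invariant(theta, I, grad_I, I0=3.0)
```

After projection, `I(theta_proj)` matches the prescribed value `I0 = 3.0` to within the tolerance, so the conservation law holds exactly up to floating-point precision.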